

The cries for help from frazzled Mac owners whose Wi-Fi connections went haywire after upgrading to OS X Yosemite are being met by Apple with stone-faced silence.

Affected users have been filing a steady stream of complaints about the problem in discussion forums, blogs and social media sites since Apple released the latest version of the operating system a week ago.

Attempts by users to isolate the cause of the issue have been fruitless so far. The problem affects a variety of Macs with dissimilar configurations, connected to many different routers. What’s clear is that it hit these users after they installed Yosemite. In most cases, Wi-Fi becomes unstable, with connections that drop every few minutes, run irritatingly slowly or are simply unusable.

The lucky few people who have managed to get their Wi-Fi working properly again have done so with one of at least 20 unique and unofficial “fixes” scattered among thousands of discussion forum postings. None of them seems to be a universal fix for the problem.

On Friday morning, a user identified as “Hevelius” in a Mac Rumors forum vented his frustration with the situation. “There must be about two dozen so-called fixes now on this forum. I’ve tried every single one of them and none of them work,” this person wrote, adding that until there is a fix that works for everyone, the best option is to revert to Mavericks, the previous version of the OS.

Some of the most active forum threads about this topic are in the official Apple Support site, including the ones titled “OSX Yosemite Wifi issues” with almost 400 comments and close to 34,000 views; “Yosemite (OS X 10.10) killed my WiFi :(” with almost 150 comments and 18,000 views; and “wifi keeps dropping since Yosemite upgrade”, which is approaching 100 comments and 6,000 views.

Even technologists are among the affected, including Eugene Wei, Flipboard’s head of product, who tweeted on Friday: “After upgrade to Yosemite, my MBP [MacBook Pro] drops my wifi network at home seemingly every 20 seconds.”

Sophos security expert Paul Ducklin took to the company’s Naked Security blog on Wednesday to ask readers for help troubleshooting the problem and figuring out workarounds.

In his post, which has generated 84 comments, Ducklin stops short of blaming Apple, saying that while the cause could be a Yosemite bug, it could also be due to latent flaws in third-party hardware and software exposed by the upgrade. However, he does acknowledge that, whatever the cause, the common trigger is a Yosemite installation.

“No one seems to know what’s wrong, and without a scientific explanation it’s hard to know where to lay the blame,” wrote Ducklin, whose Wi-Fi connection works fine for no longer than 10 minutes at a time, and then starts to melt down. He wrote a script that automates the manual process of turning his Wi-Fi off and then on after a connection disruption is detected.
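
Ducklin’s script isn’t reproduced in the post, but the general idea is simple enough to sketch. The following is a minimal, illustrative Python version (not Ducklin’s actual code), assuming the Mac’s Wi-Fi interface is en0 and using the stock networksetup and ping command-line tools; the reachability target 8.8.8.8 is just a placeholder.

    import subprocess
    import time

    INTERFACE = "en0"       # assumption: the Mac's built-in Wi-Fi interface
    CHECK_HOST = "8.8.8.8"  # placeholder host used only to test reachability

    def connection_ok():
        """Return True if a single short ping to CHECK_HOST succeeds."""
        return subprocess.call(
            ["ping", "-c", "1", "-t", "2", CHECK_HOST],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) == 0

    def bounce_wifi():
        """Turn the Wi-Fi interface off and back on with networksetup."""
        subprocess.call(["networksetup", "-setairportpower", INTERFACE, "off"])
        time.sleep(2)
        subprocess.call(["networksetup", "-setairportpower", INTERFACE, "on"])

    while True:
        if not connection_ok():
            bounce_wifi()
        time.sleep(30)  # re-check every 30 seconds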

Apple hasn’t responded to multiple requests for comment from the IDG News Service.

Source from : (itnews.com)


The auction intended to turn many U.S. TV channels into spectrum for mobile services won’t start until early 2016, the U.S. Federal Communications Commission says.

FCC Chairman Tom Wheeler had earlier forecast that the auction would begin in the middle of next year. But in August, the National Association of Broadcasters challenged some aspects of the agency’s plan in court. Due to the schedule for briefings and hearings in that case, plus the complexity of putting together the auction, the agency has pushed back its calendar. It now expects to start accepting applications in fall 2015 and to launch the auction early the following year, according to an FCC blog post on Friday.

In the so-called incentive auction, the FCC will ask the owners of TV stations to switch to different frequencies, freeing up highly desirable spectrum in the 600MHz band for mobile carriers and other users. Part of the billions that mobile operators would pay for that spectrum would go back to the station owners. But many broadcasters have resisted the plan, and Wheeler has compared the process of constructing the unprecedented auction to solving a Rubik’s Cube.

CTIA, the main industry group for U.S. mobile operators, said Friday that its members are ready and willing to bid in the now-delayed auction.

“Today’s action underscores the need to resolve the pending litigation over the FCC’s rules expeditiously. When the auction is held, mobile companies will have their checkbooks ready to participate in this critical auction that will be key to our nation’s wireless future,” CTIA Vice President of Regulatory Affairs Scott Bergmann said in a prepared statement.

Congress called for the auction in 2012, expecting about US$29 billion in proceeds that would be used partly to reduce the federal deficit and partly to fund a national LTE public-safety network. It’s one of the biggest parts of an FCC plan to make more spectrum available to carry growing amounts of mobile data traffic. The 600MHz band is ideal for mobile services because signals there go farther and penetrate walls better than higher frequencies can.

Source from : (itnews.com)


Sen. Orrin Hatch (R-Utah) this week outlined the Republican tech agenda for the next Congress, and took a position that puts him at odds with some in his own party.

Hatch, in a speech at the corporate offices of Overstock.com in Salt Lake City, called for raising the cap on H-1B visas. “Our high-skilled worker shortage has become a crisis,” said Hatch, who heads the Senate Republican High-Tech Task Force.

To support the idea of a skilled-worker shortage, Hatch cited the high demand for H-1B visas. There were 172,500 petitions this year for the 85,000 visas available under the cap, he said.

“American companies were thus unable to hire nearly 90,000 high-skilled workers they need to help grow their domestic businesses, develop innovative technologies, and compete with international competitors,” said Hatch, according to the prepared text of his remarks.

In sketching out his views on tech legislation, Hatch also talked about the need for patent litigation reform, updates to privacy protections, and incentives to businesses to encourage sharing of cyber-threat information. One incentive could be a form of liability protection.

But by arguing that H-1B demand is evidence of a high-tech worker shortage, Hatch is taking a position that runs counter to that of fellow Republicans, namely Sen. Jeff Sessions (R-Ala.) and Sen. Chuck Grassley (R-Iowa).

In a speech on the Senate floor this summer, Sessions argued that there was no evidence of a shortage of skilled workers, and said many STEM-trained workers struggle to find employment.

Grassley, who will become Senate Judiciary Committee chairman with oversight of immigration policy if Republicans re-take the Senate next month, has been critical of the use of H-1B visas by offshore outsourcing firms. These firms are the major users of the visa. In 2007, Grassley said: “Unfortunately, the H-1B program is so popular that it’s now replacing the U.S. labor force.”

By giving this speech just before the Nov. 4 election, Hatch is signaling to the high-tech industry what it can expect if the Republicans win a Senate majority.

Hatch wants the Senate to take up high-skilled immigration separately from broader comprehensive immigration reform, to improve its odds of passage. The Senate last year did approve a comprehensive immigration bill that would have allowed the cap to rise to 180,000, but the increase was tied to the broader reform package, and the House of Representatives did not take it up.

If the Republicans take the Senate, they would still need Democratic support to get an H-1B cap increase, and Democratic leaders may still want to tie that to a comprehensive agreement.

Source from : (itnews.com)


It turns out that a vital missing ingredient in the long-sought goal of getting machines to think like humans — artificial intelligence — has been lots and lots of data.

Last week, at the O’Reilly Strata + Hadoop World Conference in New York, Salesforce.com’s head of artificial intelligence, Beau Cronin, asserted that AI has gotten a shot in the arm from the big data movement. “Deep learning on its own, done in academia, doesn’t have the [same] impact as when it is brought into Google, scaled and built into a new product,” Cronin said.

In the week since Cronin’s talk, we saw a whole slew of companies — startups mostly — come out of stealth mode to offer new ways of analyzing big data, using machine learning, natural language recognition and other AI techniques that researchers have been developing for decades.

One such startup, Cognitive Scale, applies IBM Watson-like learning capabilities to draw insights from vast amounts of what it calls “dark data,” buried either in the Web — Yelp reviews, online photos, discussion forums — or on the company network, in sources such as employee and payroll files, noted KM World.

Cognitive Scale offers a set of APIs (application programming interfaces) that businesses can use to tap into cognitive-based capabilities designed to improve search and analysis jobs running on cloud services such as IBM’s Bluemix, detailed the Programmable Web.

Cognitive Scale was founded by Matt Sanchez, who headed up IBM’s Watson Labs, helping bring to market some of the first e-commerce applications based on the Jeopardy-winning Watson technology, pointed out CRN.

Sanchez, now chief technology officer for Cognitive Scale, is not the only Watson alumnus who has gone on to commercialize cognitive technologies.

Alert reader Gabrielle Sanchez pointed out that another Watson alumnus, engineer Pete Bouchard, recently joined the team at another cognitive computing startup, Zintera, as chief innovation officer. Sanchez, who studied cognitive computing in college, found a demonstration of the company’s “deep learning” cognitive computing platform to be “pretty impressive.”

AI-based deep learning with big data was certainly on the minds of senior Google executives. This week the company snapped up two Oxford University technology spin-off companies that focus on deep learning, Dark Blue Labs and Vision Factory.

The teams will work on image recognition and natural language understanding, Sharon Gaudin reported in Computerworld.

Sumo Logic has found a way to apply machine learning to large amounts of machine data. An update to its analysis platform now allows the software to pinpoint causal relationships within sets of data, Inside Big Data concluded.

A company could, for instance, use the Sumo Logic cloud service to analyze log data to troubleshoot a faulty application.

While companies such as Splunk have long offered search engines for machine data, Sumo Logic moves that technology a step forward, the company claimed.

“The trouble with search is that you need to know what you are searching for. If you don’t know everything about your data, you can’t by definition, search for it. Machine learning became a fundamental part of how we uncover interesting patterns and anomalies in data,” explained Sumo Logic chief marketing officer Sanjay Sarathy, in an interview.
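
Sumo Logic hasn’t detailed its algorithms here, but the basic idea of surfacing anomalies in machine data without a user-supplied search can be sketched with something as simple as a statistical baseline. The Python snippet below is purely illustrative, with made-up per-minute error counts; it flags values that sit far above the series average.

    from statistics import mean, stdev

    # Hypothetical per-minute error counts parsed from application logs.
    error_counts = [3, 4, 2, 5, 3, 4, 3, 2, 4, 31, 3, 4]

    def anomalies(series, threshold=3.0):
        """Return (index, value) pairs that sit more than `threshold`
        standard deviations above the mean of the series."""
        mu, sigma = mean(series), stdev(series)
        return [(i, v) for i, v in enumerate(series)
                if sigma and (v - mu) / sigma > threshold]

    print(anomalies(error_counts))  # the spike of 31 errors is flagged

A production system would presumably learn baselines per data source and adjust them over time; the point is only that the patterns are found statistically rather than by someone typing a query.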

For instance, the company, which processes about 5 petabytes of customer data each day, can recognize similar queries across different users, and suggest possible queries and dashboards that others with similar setups have found useful.

“Crowd-sourcing intelligence around different infrastructure items is something you can only do as a native cloud service,” Sarathy said.

With Sumo Logic, an e-commerce company could verify that each transaction conducted on its site completes within three seconds. If a response takes longer, an administrator can pinpoint where in the transaction flow the holdup is occurring.
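
As a hypothetical illustration of that kind of check (not Sumo Logic’s actual query language), the sketch below scans made-up per-stage timings parsed from transaction logs, flags any transaction that exceeds the three-second budget, and names its slowest stage:

    # Hypothetical per-stage timings, in seconds, parsed from transaction logs.
    transactions = {
        "txn-1001": {"cart": 0.4, "payment": 0.8, "confirmation": 0.3},
        "txn-1002": {"cart": 0.5, "payment": 3.1, "confirmation": 0.4},
    }

    BUDGET_SECONDS = 3.0

    for txn_id, stages in transactions.items():
        total = sum(stages.values())
        if total > BUDGET_SECONDS:
            slowest = max(stages, key=stages.get)
            print(f"{txn_id}: {total:.1f}s total, holdup in '{slowest}' "
                  f"({stages[slowest]:.1f}s)")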

One existing Sumo Logic customer, fashion retailer Tobi, plans to use the new capabilities to better understand how its customers interact with its website.

One-upping IBM on the name game is DataRPM, which crowned its own big data-crunching natural language query engine Sherlock (named after Sherlock Holmes who, after all, employed Watson to execute his menial tasks).

Sherlock is unique in that it can automatically create models of large data sets. Having a model of a data set can help users pull together information more quickly, because the model describes what the data is about, explained DataRPM CEO Sundeep Sanghavi.

DataRPM can analyze a staggeringly wide array of structured, semi-structured and unstructured data sources. “We’ll connect to anything and everything,” Sanghavi said.

The service can then look for ways that different data sets could be combined to provide more insight.

“We believe that data warehousing is where data goes to die. Big data is not just about size, but also about how many different sources of data you are processing, and how fast you can process that data,” Sanghavi said, in an interview.

For instance, Sherlock can pull together different sources of data and respond with a visualization to a query such as “What was our revenue for last year, based on geography?” The system can even suggest other possible queries as well.
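
How Sherlock parses the English question isn’t described, but the aggregation it ultimately has to run is a plain group-by. A minimal Python sketch, using made-up sales records in place of the connected data sources, might look like this:

    from collections import defaultdict

    # Hypothetical sales records standing in for Sherlock's connected sources.
    sales = [
        {"region": "North America", "year": 2013, "revenue": 1200000},
        {"region": "Europe",        "year": 2013, "revenue":  850000},
        {"region": "North America", "year": 2012, "revenue":  900000},
    ]

    def revenue_by_geography(records, year):
        """Sum revenue per region for the requested year."""
        totals = defaultdict(float)
        for record in records:
            if record["year"] == year:
                totals[record["region"]] += record["revenue"]
        return dict(totals)

    print(revenue_by_geography(sales, 2013))
    # {'North America': 1200000.0, 'Europe': 850000.0}

The harder parts, mapping “last year” and “geography” onto the right fields and choosing a sensible visualization, are where the claimed natural-language smarts come in.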

Sherlock has a few advantages over Watson, Sanghavi claimed. The training period is not as long, and the software can be run on-premise, rather than as a cloud service from IBM, for those shops that want to keep their computations in-house. “We’re far more affordable than Watson,” Sanghavi said.

Initially, DataRPM is marketing to the finance, telecommunications, manufacturing, transportation and retail sectors.

One company that certainly does not think data warehousing is going to die is Snowflake Computing, a startup run by Bob Muglia that recently came out of stealth mode.

Publicly launched this week, Snowflake aims “to do for the data warehouse what Salesforce did for CRM — transforming the product from a piece of infrastructure that has to be maintained by IT into a service operated entirely by the provider,” wrote Jon Gold at Network World.

Founded in 2012, the company brought in Muglia earlier this year to run the business. Muglia was the head of Microsoft’s server and tools division, and later, head of the software unit at Juniper Networks.

While Snowflake could offer its software as a product, it chooses to do so as a service, noted Timothy Prickett Morgan at Enterprise Tech.

“Sometime either this year or next year, we will see more data being created in the cloud than in an on-premises environment,” Muglia told Morgan. “Because the data is being created in the cloud, analysis of that data in the cloud is very appropriate.”

Source from : (itnews.com)


A partnership that lets Wi-Fi users get on free public networks in San Francisco and San Jose, California, with a one-time joining process now also covers a hotspot along the River Thames in London.

The cities at either end of Silicon Valley used the Wi-Fi Alliance’s Passpoint specification to set up Wi-Fi roaming between their city-owned networks earlier this year. The technology lets residents and visitors set up a secure connection with either network and then automatically get on the other city’s system whenever they enter its coverage area.

It’s an arrangement that makes a lot of sense between the two cities: They’re both home to major tech companies and are commuting distance apart. Adding in a river halfway around the world may seem like a stretch, but for travelers, the easy access to Wi-Fi across borders could be a nice convenience — and a sign of things to come.

The Thames network spans 44 kilometers (27 miles) of riverfront in the London area, with access points both along the shore and on ferries. Access to the network is included with free Wi-Fi that’s bundled with broadband plans from carrier BT.

Passpoint is a standard for automating and securing most aspects of getting onto Wi-Fi networks. It can eliminate the need to enter a username or password to join a Passpoint Wi-Fi network, even the first time you get on. To join a network initially, users only have to use a one-time provisioning file. After that, they automatically get on that network and on those of all roaming partners.
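
The provisioning details live in the Passpoint specification itself, but the selection logic a provisioned device performs can be sketched roughly: the profile carries identifiers for the home operator and its roaming consortium, and the device automatically joins any hotspot advertising a match. The Python sketch below is only illustrative; the profile fields and organization identifiers (OIs) are invented.

    # A provisioned profile: hypothetical fields and roaming-consortium OIs.
    PROFILE = {
        "friendly_name": "City Wi-Fi",
        "roaming_consortium_ois": {"0x112233", "0x445566"},
    }

    def select_hotspot(scan_results, profile):
        """Return the first scanned network whose advertised OIs overlap
        the OIs carried in the device's provisioned profile."""
        for network in scan_results:
            if profile["roaming_consortium_ois"] & set(network["ois"]):
                return network
        return None

    scan = [
        {"ssid": "SF_FreeWiFi",    "ois": ["0x112233"]},  # roaming partner
        {"ssid": "CoffeeShopOpen", "ois": []},            # no Passpoint data
    ]
    match = select_hotspot(scan, PROFILE)
    if match:
        print("Auto-joining", match["ssid"])  # credentials come from the profile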

A second release of the specification, introduced this month, is designed to make the initial joining process even simpler and more secure. There are more than 700 devices and infrastructure products certified for Passpoint, including iOS and Android devices.

Backers of Passpoint envision consumers moving from one Wi-Fi network to another wherever they go, in the same way they automatically roam among cellular carriers today. But Wi-Fi network operators are just beginning to activate the technology, which can require new or modified infrastructure. Worldwide, there are 12 live commercial deployments of the underlying technology, called Next Generation Hotspot, according to the Wireless Broadband Alliance.

The grouping of the Thames network with San Francisco’s and San Jose’s is no accident. All three networks use infrastructure from Ruckus Wireless and back-end technology from Global Reach, a Wi-Fi software and services company in London. But with Passpoint, networks that want to offer roaming among themselves don’t all have to use the same vendors, according to the standard’s supporters. San Francisco CIO Miguel Gamino said the city was approached by people from the Thames network, and that other municipalities have also asked about joining in.

Using Passpoint across oceans and borders is no great technical feat, said Farpoint Group analyst Craig Mathias. But that doesn’t mean universal free Wi-Fi will take the place of expensive cellular roaming.

“If we wanted to connect the whole world into Passpoint, we could,” Mathias said. But where the technology is the same everywhere, the way services are packaged and paid for will vary, he said. Carriers, network operators and hotspot aggregators will continue to compete as they do now, sometimes with free access and sometimes with paid. “You’ll see all kinds of hybrids,” Mathias said.

One limitation of the roaming setup is that the fully streamlined user experience is only available on Apple iOS and OS X devices, according to Ruckus. Users can easily go from one network to another on some Samsung devices, too, but only those with SIM cards, said David Wright, technical director for Ruckus’s carrier business. With non-SIM devices such as tablets, the process is more complicated. That’s because OS support is still limited: Passpoint is included in iOS but not yet in Android or Windows, Wright said.

Source from : (itnews.com)


The blame game between Oracle and the state of Oregon is going into overtime, even before their dueling lawsuits over the disastrous Cover Oregon health insurance exchange website make it into court.

Cover Oregon went live on Oct. 1 last year, and like the federal Healthcare.gov site that’s a centerpiece of President Barack Obama’s health care reform legislation, immediately ran into major performance problems. Unlike Healthcare.gov, Cover Oregon never reached full functionality.

Both sides have flung lawsuits at one another, with Oracle seeking millions in unpaid fees and the state hoping to claw back whatever taxpayer money it can from the company.

While Oregon is now using Healthcare.gov to enroll people in private insurance plans, it also planned to salvage some of the Oracle technology created by the US$240 million project and use it for a Medicaid enrollment system.

Not any more. The plan now is to scrap Oracle’s work entirely and use a Medicaid enrollment system already developed by another state, which has yet to be chosen, according to a report this week in the Oregonian newspaper. Deloitte had been hired by Oregon to work on the transition and is continuing in that role.

A couple of factors weighed into the state’s decision to fully part ways with Oracle, according to the Oregonian’s report. For one thing, Oracle wouldn’t approve a plan by the state to turn over hardware from data centers it has in Texas and Utah to Oregon. It was also determined that Oracle’s technology wouldn’t end up working well with other IT assets the state has, the report said.

But in a letter to an attorney for Oregon, which was seen by IDG News Service, Oracle general counsel Dorian Daley tells quite a different story.

First, Cover Oregon asked Oracle to continue running the Medicaid system in a production environment only, while eliminating the pre-production and test environment, according to Daley.

“This would mean that in the event of any required patches or updates, including critical security updates, the changes would have to be entered into the production environment directly, without testing,” Daley wrote. “This is not a responsible option and Oracle will not run a system under those circumstances.”

Without a pre-production environment as backup, any outages could result in data losses and significant downtime, Daley added.

Moreover, a backup system gives administrators the ability to check the impact of patches and other changes on the application before they are applied to the production environment. It’s common for this sort of testing to “uncover unintended deleterious consequences to the application that must be repaired and retested,” she wrote.

Oregon CIO Alex Pettit told Oracle in a letter that Cover Oregon would like to “recover possession” of the Exadata servers being used for the production environment, according to Daley.

While not stated explicitly, Pettit’s letter suggested that the state wants to use the servers now running the pre-production environment and use them to run the production environment needed for what Deloitte is doing, Daley added. Oregon “evidently did not budget for the new servers it now requires,” she wrote.

After Oracle made its objections to Oregon’s requests, Cover Oregon made a number of alternative proposals, “none of which are acceptable,” Daley said. Among these were suggestions that Oracle move the system off Exadata and onto commodity hardware, which isn’t technically viable since the same configurations aren’t possible, she wrote.

Cover Oregon also proposed running the pre-production environment “on hardware already leased and used by the state,” which would be a violation of Oracle’s privacy and data security policies, Daley added.

Oregon has some options, which include having Deloitte manage the production environment instead of Oracle, or buying new hardware for it from other vendors, Daley said. Oracle is also willing to discuss how to lower Oregon’s costs for Oracle’s management services, and even to terminate its services contract early so the state can find another provider, she added.

A Cover Oregon spokeswoman didn’t respond to a request for comment on Daley’s letter, which wasn’t mentioned in the Oregonian’s report.

“The State’s apparent decision to now move off the system entirely is puzzling and will cost the state even more time and money, but that, too, is par for the course,” Oracle spokeswoman Deborah Hellinger said via email. “It seems every decision the State makes is politically motivated as evidenced by the fact that the State’s legal counsel is apparently briefing the press before briefing Oracle.”

Source from : (itnews.com)


Microsoft reported a 25 percent increase in revenue from the same period a year ago as some of its key businesses, including Office 365 and the Xbox, delivered sharp gains.

Overall, Microsoft reported revenue of $23.2 billion, an increase of 25 percent from a year ago. Microsoft’s net income slipped to $4.54 billion, down 13.4 percent from $5.2 billion a year ago. However, those results also included $1.14 billion in restructuring costs attributed to the integration of Nokia.

“We are innovating faster, engaging more deeply across the industry, and putting our customers at the center of everything we do, all of which positions Microsoft for future growth,” said Satya Nadella, chief executive officer of Microsoft, in a statement. “Our teams are delivering on our core focus of reinventing productivity and creating platforms that empower every individual and organization.”

Highlights of the quarter included the Devices and Consumer business, which grew 47 percent to $10.96 billion. That business includes the Surface line, which recorded $908 million in revenue and is closing in on $1 billion per quarter, even as the Surface Pro 2 has quietly disappeared from the product lineup. Xbox console sales, including both the Xbox One and Xbox 360, more than doubled.

Microsoft said that its Nokia phone revenue totaled $2.6 billion, with $478 million of gross margin.

Commercial revenue grew 10 percent to $12.28 billion, Microsoft said. Windows volume licensing revenue increased 10 percent, it added.

Source from : (itnews.com)


The launch this week of Apple Pay is giving many people their first taste of NFC payment technology, which allows them to buy things in a store by bringing an iPhone 6 close to a compatible terminal.

But the NFC standard also allows payments to be made directly between smartphones. Apple Pay and competing services such as Google Wallet don’t support that part of the standard, but the technology already exists inside many of today’s NFC-equipped phones and could one day allow retailers to accept NFC payments using smartphones. It would also be technically possible for individuals to exchange money with their friends through the same system.

The NFC standard defines three modes for the technology.

The first, and simplest, is for reading small snippets of information such as phone numbers or Web addresses from wireless tags. The second, used by Apple Pay and Google Wallet, involves the phone emulating an NFC payment card. The third allows the NFC chip to function as a card reader.

It’s this third mode, if enabled by Apple or Google, that would allow a smartphone to accept payments.
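
Reader mode is already accessible outside the phone makers’ walled gardens. As a rough, hedged illustration, the open-source nfcpy library drives a USB NFC reader in exactly this role; the sketch below only pulls the NDEF records (URLs, text, phone numbers) off an ordinary passive tag rather than a payment card, so treat it as indicative, not as anything Apple or Google expose today.

    import nfc  # nfcpy; assumes a supported USB NFC reader is attached

    def on_connect(tag):
        # Reader/writer role: read the small snippets stored on a passive tag.
        if tag.ndef is not None:
            for record in tag.ndef.records:
                print(record)
        return True  # hold the connection until the tag is removed

    with nfc.ContactlessFrontend('usb') as clf:
        clf.connect(rdwr={'on-connect': on_connect})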

Smartphone payment systems are already common in some countries. Perhaps the most famous and recognizable is the Square system, which uses a small magnetic stripe reader that plugs into a phone’s headphone jack. In Europe, some companies offer card readers for more secure chip-based payment cards, but nothing exists yet for NFC.

“We’ve been working on it at Mastercard for a little while,” said James Anderson, the company’s senior vice president of emerging payments. “We’re interested in turning every phone into a payment acceptance device.”

But while many of the NFC chips used in smartphones support it, not every phone could necessarily be used to accept payments.

There are a number of requirements for NFC payments, such as the minimum and maximum distance over which the NFC transaction can take place and the speed with which it happens. Terminal manufacturers design for these requirements, but not all smartphones have necessarily been built with these in mind.

“There may be situations in which the two devices may technically be capable of exchanging data for payment, but may not be able to achieve the read ranges and/or response times required for them to be certified as contactless payment capable,” said Sam Shrauger, senior vice president of digital solutions at Visa.

For now, the industry is watching closely to see consumer reaction to Apple Pay. If it takes off, competitors such as Google would probably step up their offerings and the NFC payments field would become a lot more competitive. That might bring new features such as phone-to-phone payments into play.

Source from : (itnews.com)


Amazon.com continued to increase its sales last quarter but losses also mounted, to the growing consternation of investors.

Amazon’s overall sales were up 20 percent in the third quarter, amounting to $20.58 billion for the three months ended Sept. 30, the company said Thursday.

At the same time, however, Amazon’s losses reached $437 million, compared to a loss of $41 million in the same period last year.

That led investors to push its share price lower. Amazon’s shares were selling for $279.00 at the time of this report, down 11 percent from the close of regular trading.

One culprit for the higher-than-expected loss was the Amazon Fire phone. It didn’t sell as well as expected and contributed heavily to a $170 million write-off the company had to take this quarter, CFO Thomas Szkutak revealed in a conference call.

A switch among students from buying to renting textbooks also contributed to the wider loss.

CEO Jeff Bezos has long argued that investing for growth is the right strategy for Amazon. The company also made a big acquisition this year, picking up the gaming site Twitch for almost $1 billion, and it has released new tablets at a rapid clip.

Szkutak reiterated the company’s strategy in the conference call. Amazon’s goal is to maximize its free cash flow over the long term rather than concentrate on profit margins, he said.

“We’ve been, for several years now, in an investment mode, because of the opportunities we’ve had in front of us,” Szkutak said. “We know we must be selective in which opportunities we pursue, but we’ve been encouraged by the opportunities we have.”

Just hours before it released its earnings report, Amazon updated its Kindle Voyage book reader, making it lighter and faster. The Kindle Fire tablet got an update recently, and the company released a new version of that product for kids. Its game studio launched a slew of updated titles for the Fire tablet, including “Til Morning’s Light,” “CreepStorm,” and “Tales From Deep Space.”

In a move to tap into the growing mobile payments industry, Amazon launched Local Register, a mobile app and credit card reader. It also ventured into three-dimensional printing with an online marketplace that initially offers about 200 schematics for printable objects. Along with its tests of drones for product delivery, these moves are signs that Amazon is investing to secure its position in the future. The question is how patient its investors will be while the losses pile up.

Amazon doesn’t break out figures for Amazon Web Services, though it claimed “usage growth” of close to 90 percent from a year earlier.

“The [AWS] team is doing a fantastic job, not only in serving customers, but also in launching many new features and services,” Szkutak said.

Typically, Amazon includes the AWS results in the category of U.S. supplemental revenue. That category generated $1.34 billion in the quarter, up 40 percent from the same quarter a year earlier. It includes revenue from advertising services and Amazon-branded credit cards, but AWS makes up the lion’s share.

One analyst asked if Amazon will continue to sell hardware, given the lackluster sales of the Fire phone and some other less-than-enthusiastically-received products.

“I can’t speculate what we’ll do going forward. But we just launched a number of new tablets and eReaders we are excited about, at some really great price points. We’re excited to have these offerings for customers,” Szkutak said.

Source from : (itnews.com)


Microsoft’s revenue leaped 25 percent in the first quarter but its profits dropped, pulled down by expenses tied to layoffs and the integration of Nokia’s phone business.

Revenue hit US$23.2 billion in the quarter ended Sept. 30, easily exceeding the $22 billion consensus expectation from analysts polled by Thomson Reuters.

Net income was $4.54 billion, or $0.54 per share. That exceeded analysts’ expectations by $0.05, but was a drop of 13 percent compared with last year’s first quarter.

Profit was hurt by $1.14 billion of integration and restructuring expenses, which pulled down earnings by $0.11 per share. The charges resulted from the massive round of layoffs that started in July, and from the ongoing meshing of Nokia’s Devices and Services business, whose $7.2 billion acquisition closed in April.

Microsoft CFO Amy Hood characterized the results as “a strong start to the year,” saying Microsoft benefitted from continued “momentum” in sales of its cloud computing products, and from “meaningful progress” across its devices businesses.

“We are innovating faster, engaging more deeply across the industry, and putting our customers at the center of everything we do, all of which positions Microsoft for future growth,” CEO Satya Nadella said in a statement.

Microsoft, which splits its business into two main segments, grew its Devices and Consumer revenue by 47 percent to almost $11 billion, while its Commercial revenue rose 10 percent to $12.3 billion.

In the Devices and Consumer business, the company highlighted 25 percent sequential growth in subscribers for Office 365 Home and Personal from the fourth quarter, Surface Pro 3 sales of about $900 million and a doubling of Xbox console sales. Phone hardware revenue — the Nokia business — topped $2.6 billion.

In Commercial, Microsoft trumpeted that sales of its on-premises server products, like SQL Server, System Center and Windows Server, grew 13 percent, while sales of cloud computing software and services, such as Office 365, Azure and Dynamics CRM Online, shot up 128 percent. Windows volume licensing grew 10 percent.

Microsoft announced in July its intention to lay off 18,000 employees, or 14 percent of its workforce, the largest reduction in its history. The company has already cut about 15,100 positions in two layoff rounds. It’s not clear when it will eliminate the other 2,900 jobs, but the plan is to do so before the current fiscal year ends in June.

Performance in the Devices and Consumer business, which Microsoft breaks down into four sub-segments, was unequal.

The Licensing sub-segment had a 9 percent revenue drop year-on-year, down almost $400 million. The biggest tumble was Windows Phone revenue, which cratered 46 percent because Microsoft sold fewer smartphones. Also down were sales of its traditional Office software suite, which fell 5 percent, cannibalized by sales of the subscription-based Office 365 Home and Personal. And Windows sales to hardware makers fell 2 percent.

The Computing and Gaming Hardware sub-segment picked up part of the slack with a 74 percent revenue bump, driven by Surface sales, which more than doubled thanks to interest in the latest Surface Pro 3. Xbox platform revenue also helped, increasing 58 percent.

The third sub-segment, Phone Hardware, generated $2.6 billion in revenue, as Microsoft sold 9.3 million Lumia smartphones and almost 43 million feature phones. The final sub-segment, known as Other, grew its revenue 16 percent, helped by Office 365 consumer sales, which were up by almost $90 million, Bing search ad revenue, and sales of Microsoft-owned video games.

Meanwhile, the performance in the Commercial business reflected massive but low-growth sales of traditional on-premises software, and eye-popping growth from cloud computing applications and services. On-premises licensing inched up 3 percent to almost $10 billion, with strong sales of servers and tools, as well as of Windows licenses to businesses, which were up 10 percent. Sales of conventional perpetual-license Office suites dropped 7 percent, hurt by Office 365.

On the other hand, sales of Azure infrastructure- and platform-as-a-service cloud products, and of Office 365 cloud suites, along with enterprise IT professional services, like consulting, grew 50 percent.

During a conference call to discuss the results, Nadella said he was happy not only with the business performance but with a changing corporate culture he described as “fast, innovative, partner friendly and customer-obsessed.”

He said demand for Microsoft’s cloud products has been growing fast among consumers and businesses, so Microsoft is expanding its data center capacity and constantly adding features to its Azure services and cloud applications.

“One major Azure service or feature is released every three days on average,” he said.

At the same time, sales of on-premises server software remain solid, partly because of Microsoft’s hybrid strategy of letting customers run applications both on premises and in the cloud, according to Nadella.

The Windows franchise, which has been on a bumpy road due to the mixed reception for Windows 8, is benefitting from the decision this year to give away licenses to makers of small devices — those with screens of 9 inches or smaller, he said. This has led to sub-$200 Windows devices.

He was also bullish about Windows 10, which is in a public testing period now and should be launched by mid-2015.

“Windows 10 will deliver a single unified application development platform: one way to write a universal app across the entire family of Windows devices, and one store with a unified way for applications to be discovered, purchased and updated across all of these devices,” he said.

Nadella also called out Surface, highlighting the increased sales and improved “business economics” of the product, which Computerworld estimated had lost about $1.7 billion between its launch in 2012 and August this year.

In Gaming, Nadella highlighted the $2.5 billion acquisition of Minecraft-maker Mojang, expected to close next month, because it “extends our ecosystem and community across multiple platforms” and strengthens Microsoft’s roster of video games.

Source from : (itnews.com)
