Windows 10: Fact vs. fiction

With Win10 slated to drop July 29, we give you the straight dope on support, upgrades, and the state of the bits

It’s a few days before Windows 10 is officially slated to drop, and still, confusion abounds. Worse, many fallacies regarding Microsoft’s plans around upgrades and support for Win10 remain in circulation, despite efforts to dispel them.

Here at InfoWorld, we’ve been tracking Windows 10’s progress very closely, reporting the evolving technical details with each successive build in our popular “Where Windows 10 stands right now” report. We’ve also kept a close eye on the details beyond the bits, reporting on the common misconceptions around Windows 10 licensing, upgrade paths, and updates. If you haven’t already read that article, you may want to give it a gander. Many of the fallacies we pointed out six weeks ago are still as fallacious today — and you’ll hear them repeated as fact by people who should know better.

Here, with Windows 10 nearing the finish line, we once again cut through the fictions to give you the true dirt — and one juicy conjecture — about Windows 10, in hopes of helping you make the right decisions regarding Microsoft’s latest Windows release when it officially lands July 29.

Conjecture: Windows Insiders already have the “final” version of Windows 10

Give or take a few last-minute patches, members of the Windows Insider program may already have what will be the final version of Win10. Build 10240, with applied patches, has all the hallmarks of a final “general availability” version.

If you’re in the Insider program, either Fast or Slow ring, and your computer’s been connected to the Internet recently, you’ve already upgraded, automatically, to the Windows 10 that’s likely headed out on July 29. No, I can’t prove it. But all the tea leaves point in that direction. Don’t be surprised if Terry Myerson announces on July 29 that Insiders are already running the “real” Windows 10 — and have been running it for a couple of weeks. Everyone else can get a feel for the likely “final” Windows 10, build 10240, by checking out our ongoing Windows 10 beta coverage at “Where Windows stands right now.”
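If you want to confirm which build you're on, here's a minimal sketch using Python's standard library (the "10.0.10240" value is what a Windows 10 build 10240 machine would report; any other operating system will print its own version string, so treat the output as informational):

```python
import platform

# On Windows, platform.version() reports the kernel build string,
# e.g. '10.0.10240' on the build discussed here; on other operating
# systems it returns that OS's own version information.
build = platform.version()
print(build)
```

The built-in `winver` dialog on Windows shows the same build number without any code.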

Fact: Windows 10 has a 10-year support cycle

Like Windows Vista, Win7, and Win8 before it, Windows 10 has a 10-year support cycle. In fact, we’re getting a few extra months for free: According to the Windows Lifecycle fact sheet, mainstream support ends Oct. 13, 2020, and extended support ends Oct. 14, 2025. Of course, if your sound card manufacturer, say, stops supporting Windows 10, you’re out of luck.

I have no idea where Microsoft’s statement about covering Windows 10 “for the supported lifetime of the device” came from. It sounds like legalese that was used to waffle around the topic for seven frustrating months. Microsoft’s publication of the Lifecycle fact sheet shows that Windows 10 will be supported like any other version of Windows. (XP’s dates were a little different because of SP2.)

Fiction: The 10 years of support start from the day you buy or install Windows 10

There’s been absolutely nothing from Microsoft to support the claim that the Win10 support clock starts when you buy or install Windows 10, a claim that has been attributed to an industry analyst.

The new Windows 10 lifecycle and updating requirements look a lot like the old ones, except they’re accelerated a bit. In the past we had Service Packs, and people had a few months to get the Service Packs installed before they became a prerequisite for new patches. With Windows 8.1, we had the ill-fated Update 1: You had to install Update 1 before you could get new patches, and you only had a month (later extended) to get Update 1 working. The new Windows 10 method — requiring customers to install upgrades/fixes/patches sequentially, in set intervals — looks a whole lot like the old Win 8.1 Update 1 approach, although corporate customers in the Long Term Servicing Branch can delay indefinitely.

Fact: You can clean install the (pirate) Windows 10 build 10240 ISO right now and use it without entering a product key

Although it isn’t clear how long you’ll be able to continue to use it, the Windows 10 build 10240 ISO can be installed and used without a product key. Presumably, at some point in the future you’ll be able to feed it a new key (from, say, MSDN), or buy one and use it retroactively.

Fiction: You can get a free upgrade to Windows 10 Pro from Win7 Home Basic/Premium, Win8.1 (“Home” or “Core”), or Win8.1 with Bing

A common misconception is that you can upgrade, for free, from Windows 7 Home Basic or Home Premium, Windows 8.1 (commonly called “Home” or “Core”), or Windows 8.1 with Bing, to Windows 10 Pro. Nope, sorry — all of those will upgrade to Windows 10 Home. To get to Windows 10 Pro, you would then have to pay for an upgrade, from Win10 Home to Pro.

Fact: No product key is required to upgrade a “genuine” copy of Win7 SP1 or Win8.1 Update
According to Microsoft, if you upgrade a “genuine” copy of Windows 7 SP1 or Windows 8.1 Update, come July 29 or later, Windows 10 won’t require a product key. Just keep the Home and Pro versions separate — upgrade Home to Home and Pro to Pro. If you upgrade and then perform a Reset (Start, Settings, Update & Security, Recovery, Reset this PC), you get a clean install of Windows 10 — again, per Microsoft. It’ll take a few months to be absolutely certain that a Reset performs an absolutely clean install, but at this point, it certainly looks that way.

Fiction: Windows 10 requires a Microsoft account to install, use, or manage

Another common misconception is that Microsoft requires users to have a Microsoft account to install, use, or manage Windows 10. In fact, local accounts will work for any normal Windows 10 activity, although you need to provide a Microsoft account in the obvious places (for example, to get mail), with Cortana, and to sync Edge.

Fact: If your tablet runs Windows RT, you’re screwed

Microsoft has announced it will release a new version of Windows RT, called Windows RT 3, in September. Anybody expecting it to look anything like Windows 10 is sorely mistaken. If you bought the original Surface or Surface RT, you’re out of luck. Microsoft sold folks an obsolete bucket of bolts that, sad to say, deserves to die. Compare that with the Chromebook, which is still chugging along.

Fiction: Microsoft pulled Windows Media Player from Windows 10

One word here seems to be tripping up folks. What Microsoft has pulled is Windows Media Center, which is a horse of a completely different color. If you’re thinking of upgrading your Windows Media Center machine to Windows 10, you’re better off retiring it and buying something that actually works like a media center. WMP is still there, although I wonder why anybody would use it, with great free alternatives like VLC readily available.

Fiction: Windows 10 is a buggy mess
In my experience, Windows 10 build 10240 (and thus, presumably, the final version) is quite stable and reasonably fast, and it works very well. There are anomalies — taskbar icons disappear, some characters don’t show up, you can’t change the picture for the Lock Screen, lots of settings are undocumented — and entire waves of features aren’t built yet. But for day-to-day operation, Win10 works fine.

Fact: The current crop of “universal” apps is an electronic wasteland
Microsoft has built some outstanding universal apps on the WinRT foundation, including the Office trilogy, Edge, Cortana, and several lesser apps, such as the Mail/Calendar twins, Solitaire, OneNote, and the Store. But other software developers have, by and large, ignored the WinRT/universal shtick. You have to wonder why Microsoft itself wasn’t able to get a universal OneDrive or Skype app going in time for July 29. Even Rovio has taken a pass on bringing Angry Birds 2 to the universal platform. Some games are coming (such as Rise of the Tomb Raider), but don’t expect a big crop of apps for the universal side of Windows 10 (and, presumably, Windows 10 Mobile) any time soon.

Fiction: Microsoft wants to control us by forcing us to go to Windows 10
I hear variations on this theme all the time, and it’s tinfoil-hat hooey. Microsoft is shifting to a different way of making money with Windows. Along the way, it’s trying out a lot of moves to reinvigorate the aging cash cow. Total world domination isn’t one of the options. And, no, the company isn’t going to charge you rent for Windows 10, though it took seven months to say so, in writing.

Fiction: Windows 7 and Windows 8 machines will upgrade directly to Windows 10

Win7 and Win8 machines won’t quite upgrade directly to Win10. You need Windows 7 Service Pack 1, or Windows 8.1 Update 1, in order to perform the upgrade. If you don’t have Windows 7 SP1, Microsoft has official instructions that’ll get you there from Windows 7. If you’re still using Windows 8, follow these official instructions to get to Windows 8.1 Update. Technically, there’s a middle step on your way to Win10.

Fact: We have no idea what will happen when Microsoft releases a really bad patch for Windows 10

If there’s an Achilles’ heel in the grand Windows 10 scheme, it’s forced updates for Windows 10 Home users and Pro users not attached to update servers. As long as Microsoft rolls out good-enough-quality patches — as it’s done for the past three months — there’s little to fear. But if a real stinker ever gets pushed out, heaven only knows how, and how well, Microsoft will handle it.

Fact: You’d have to be stone-cold crazy to install Windows 10 on a production machine on July 29
There isn’t one, single killer app that you desperately need on July 29. Those in the know have mountains of questions, some of which won’t be answered until we see how Win10 really works and what Microsoft does to support it. If you want to play with Windows 10 on a test machine, knock yourself out. I will, too. But only a certified masochist would entrust a working PC to Windows 10, until it’s been pushed and shoved and taken round several blocks, multiple times.

You have until July 29, 2016, to take advantage of the free upgrade. There’s no rush. Microsoft won’t run out of bits.


MCTS Training, MCITP Training

Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com

Windows revenue takes another bad beating

Third consecutive quarter of double-digit declines, but CEO is confident Windows 10 will ‘restore growth’

Microsoft on Tuesday said that Windows revenue again declined by double digits, the third straight such quarter, with sales of licenses to computer makers down 22% from the same period last year.

For the June quarter, Windows revenue from OEMs (original equipment manufacturers) was off $683 million relative to the same three-month span in 2014, making the decline for the fiscal year — Microsoft’s ended on June 30 — approximately $1.9 billion.
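Those two figures can be cross-checked with back-of-envelope arithmetic using only the numbers in this article: a 22% year-over-year drop that amounts to $683 million implies a year-ago quarter of roughly $3.1 billion in OEM Windows revenue.

```python
decline_usd_m = 683   # reported OEM revenue decline, in $M
decline_rate = 0.22   # reported 22% year-over-year drop

# Back out the implied year-ago and current-quarter figures
prior_quarter = decline_usd_m / decline_rate      # ~ $3,105M a year earlier
current_quarter = prior_quarter - decline_usd_m   # ~ $2,422M this quarter
print(round(prior_quarter), round(current_quarter))
```

These are illustrative estimates only; Microsoft does not break out exact OEM Windows revenue.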

The bulk of Windows’ revenue comes from sales to OEMs, which pre-load the operating system on PCs, 2-in-1s, tablets and a few smartphone models. In the past, Microsoft has said 65% to 72% of Windows revenue stemmed from OEM sales.

Second quarter revenue from OEMs was down 27% for what Microsoft calls the “non-Pro” category and off 21% for the “Pro” class. The terms refer to the kind of Windows license, with non-Pro indicating the OS for consumer PCs and tablets, and Pro for devices targeting businesses. In Windows 10, for instance, the former will be Windows 10 Home while the latter will be Windows 10 Pro.

The declines in both non-Pro and Pro were slightly larger than in the first quarter of 2015.

Microsoft blamed the consumer licensing downturn on slack in the sales channel as OEMs prepared devices for Windows 10, a 180-degree turn from the prior quarter, when it said the channel was still stuffed with PCs left over from the holidays.

“OEMs tightly managed PC inventory ahead of the Windows 10 launch, particularly in developed markets,” said CFO Amy Hood in prepared remarks during the front end of an earnings call with Wall Street Tuesday. “In our view, this is a healthy state for the channel as we head into a transformational launch that starts next week,” she added, referring to the July 29 debut of Windows 10.

Hood returned to the scaled-back OEM inventories when she responded to a question about whether Windows 10 would make up some of its second-quarter declines caused by the emptying retail sales channel. “Before every launch, we tend to have a tightening in the channel as they prepare and run reasonably lean,” Hood answered. “This is a healthy state. It’s within the range of normal.”

Meanwhile, Hood said Pro license revenue was still hamstrung by the tough comparisons in 2014 when sales of business PCs jumped as companies purged Windows XP — which was retired in April of that year — from their organizations. Microsoft has used XP as the whipping boy for the last several reporting periods, and gave the 2001 OS a few more licks Tuesday.

Also in play, although not stressed much by Microsoft, perhaps because it’s a broken record: the underlying problems of the PC industry, which continued a 14-quarters-and-counting contraction and seems destined to be almost entirely a replacement market, with little sign of any meaningful growth down the line.

Both Hood and CEO Satya Nadella, who was also on the call, spun the Windows declines as less about the loss of revenue in the quarter just past and more about the opportunities ahead with Windows 10.

“With Windows 10, we expect momentum to build throughout the year, as we and our partners bring new devices, applications, and services to market,” said Hood. “We expect this to benefit our business results in the second half of the fiscal year.” Microsoft’s fiscal year runs from July to the following June, so Hood was referring to the first half of 2016.

Nadella pitched in as well. “Our aspiration with Windows 10 is to move people from needing to choosing to loving Windows,” he said, repeating remarks he made earlier this year.

Not surprisingly — because it’s part of every CEO’s job description, no matter what industry or under what circumstances — Nadella was confident Windows 10 would turn around the company’s OS fortunes, if not in direct licensing revenue then in sales of after-market services and software, and advertising opportunities on its Bing search site.

“While the PC ecosystem has been under pressure recently, I do believe that Windows 10 will broaden our economic opportunity and return Windows to growth,” Nadella said. He touted the large number of devices and configurations in the testing process for Windows 10 certification, most of which won’t be available until later this year, as well as some revenue and gross margin growth possibilities from Microsoft’s own hardware, primarily the Surface Pro portfolio.

“Third, we will grow monetization opportunities across the commercial and consumer space,” Nadella pledged. “For consumers, Windows 10 creates monetization opportunities with store, search, and gaming.”

The three money-makers Nadella ticked off were the same ones Hood outlined to financial analysts in May, when she fleshed out the firm’s “Windows as a service” monetization strategy. Microsoft intends to shift revenue generation from its decades-long practice of licensing Windows to one more reliant on revenue from search ads within Bing results, gaming and apps sold through the Windows Store.

That strategy has led Microsoft to a number of radical decisions, including giving away Windows licenses to smartphone and small tablet makers — a move that hasn’t done much for the OS’s share in those categories — subsidizing Windows to makers of cut-rate notebooks, and most importantly for Windows 10, giving away upgrades to the new OS from Windows 7 and Windows 8.1.

The biggest contributor to that money-making strategy in the June quarter was clearly Bing. In a filing with the U.S. Securities and Exchange Commission (SEC), Microsoft said that Bing search advertising revenue had increased 21%, or $160 million, in the second quarter compared to the same period the year prior. Adding Cortana, Microsoft’s digital assistant and prognosticator, to Windows 10 was also part of the plan to increase Bing’s importance to the OS — which features strong links to the search engine in multiple components, including the new Edge browser — and use Windows 10 to drive the search service’s revenues.

While the growth in Bing ad revenue was less than a fourth of the decline in Windows revenue during the quarter, it was something.

Microsoft said nothing in the SEC filing about app revenue — perhaps because it remains minuscule — but it did boast of a $205 million increase, representing a 58% boost, from Xbox Live, its subscription-based multiplayer network. Xbox Live is baked into Windows 10, and Microsoft has pinned significant revenue hopes on the OS and Xbox Live reinvigorating the company’s PC gaming business. Because the service will be free on PCs and tablets running Windows 10, the monetization angle comes from the ties between the console and PC platforms, and from sales of, and on, the console side.
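Working backward from the percentages Microsoft disclosed (an illustrative sketch; the dollar and percentage figures are the ones reported above), the implied year-ago bases come out to roughly $762 million for Bing search ads and $353 million for Xbox Live:

```python
# Bing: +$160M on 21% growth; Xbox Live: +$205M on 58% growth.
# Dividing the dollar increase by the growth rate backs out the
# implied year-ago quarterly revenue for each business.
bing_prior = 160 / 0.21   # ~ $762M in the year-ago quarter
xbox_prior = 205 / 0.58   # ~ $353M in the year-ago quarter
print(round(bing_prior), round(xbox_prior))
```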

“Gaming is an important scenario for Windows 10, and our success with Xbox this quarter gives us a strong starting position heading into launch,” said Nadella Tuesday.

And he remained glass-half-full. “We are confident that these are the right levers to revitalize Windows and restore growth,” Nadella said.

In general, Microsoft’s second quarter was a mess because of $8.4 billion in charges and layoffs in its phone division, resulting in the biggest-ever single-quarter loss and the first since 2012.

Microsoft took a $3.2 billion net loss for the quarter, compared to a $4.6 billion net profit for the second quarter of 2014, a $7.8 billion flip.
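The size of that flip is simple arithmetic: moving from a $4.6 billion profit to a $3.2 billion loss swings the bottom line by $7.8 billion.

```python
q2_2014_net = 4.6    # $B net profit, June quarter 2014
q2_2015_net = -3.2   # $B net loss, June quarter 2015

# The year-over-year swing is the difference between the two results
swing = q2_2014_net - q2_2015_net
print(f"${swing:.1f}B swing")
```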



Biggest tech industry layoffs of 2015, so far

Microsoft, BlackBerry, NetApp among those trimming workforces

While the United States unemployment rate has hit a post-recession low, the network and computing industry has not been without significant layoffs so far in 2015.

Some companies’ workforce reductions are tricky to calculate, as layoff plans announced by the likes of HP in recent years have been spread across multiple years. But here’s a rundown of this year’s big layoffs either formally announced or widely reported on.

*Good Technology:
The secure mobile technology company, in prepping to go public in the near future, laid off more than 100 people late last year or early this year, according to reports in January by TechCrunch and others. Privately held Good, which employs more than 1,100 people according to its LinkedIn listing, doesn’t comment on such actions, though the company did say in an amended IPO filing in March that it would need to slash jobs this fiscal year if certain funding doesn’t come through. Good also showed improved financials, in terms of growing revenue and reduced losses, in that filing. Meanwhile, the company continues its business momentum with deals such as an extended global reseller agreement announced with Samsung Electronics America in June.

*Sony:
Reuters and others reported in January that Sony would be cutting around 1,000 jobs as a result of its smartphone division’s struggles. The Wall Street Journal in March wrote that Sony was clipping 2,000 of its 7,000 mobile unit workers as it attempts to eke out a profit and refocus, possibly on software, to fare better vs. Apple and other market leaders. Sony’s mobile business, despite solid reviews for its Xperia line of handsets, is nearly nonexistent in big markets such as the United States and China, according to the WSJ report. Still, the company’s president says Sony will never exit the market.

*Citrix:
The company’s 900 job cuts, announced in January along with a restructuring and improved revenue, were described by one analyst as “defensive layoffs” made in view of some disconcerting macroeconomic indicators, such as lower oil prices and a strengthening dollar. The virtualization company said its restructuring, including layoffs of 700 full-time employees and 200 contractors, would save it $90 million to $100 million per year as it battles VMware, Microsoft and others in the virtualization and cloud markets.

*NetApp:
The company announced in May, while revealing disappointing financial results, that it would be laying off 500 people, or about 4% of its workforce. It’s the third straight year that the storage company has had workforce reductions, and industry watchers are increasingly down on NetApp. The company has been expanding its cloud offerings but has also been challenged by customers’ moves to the cloud and the emergence of new hyperconvergence players attacking its turf.

*Microsoft:
In scaling down its mobile phone activities, Microsoft is writing off the whole value of the former Nokia smartphone business it bought last year and laying off up to 7,800 people from that unit. Microsoft also announced 18,000 job cuts last year, including many from the Nokia buyout. Despite an apparent departure from the phone business, CEO Satya Nadella said Microsoft remains committed to Windows Phone products and working with partners.

*BlackBerry:
The beleaguered smartphone maker acknowledged in May it was cutting an unspecified number of staff in its devices unit in an effort to return to profitability and focus in new areas, such as the Internet of Things (it did eke out a quarterly profit earlier this year, though is still on pace to register a loss for the year). The Waterloo, Ontario outfit said in a statement that it had decided to unite its device software, hardware and applications business, “impacting a number of employees around the world.” Then in July BlackBerry again said it was making job cuts, and again didn’t specify the number.

*Qualcomm:
The wireless chipmaker is the latest whose name is attached to layoff speculation, and official word on this could come as soon as this week, given the company is announcing its quarterly results. The San Diego Union-Tribune reports that “deep cost cuts” could be in the offing, including thousands of layoffs, possibly equaling 10% of the staff. The company wasn’t commenting ahead of its earnings conference call on July 22. Qualcomm has been a high flyer in recent years as a result of the smartphone boom, but regulatory issues in China, market share gains by Apple and being snubbed by Samsung in its latest flagship phone have all hurt Qualcomm of late, the Union-Tribune reports.

*Lexmark: The printer and printer services company this month announced plans for 500 layoffs as part of a restructuring related to a couple of recent acquisitions. The $3.7 billion Kentucky-based company employs more than 12,000 people worldwide.

The top 10 supercomputers in the world, 20 years ago

In 1995, the top-grossing film in the U.S. was Batman Forever. (Val Kilmer as Batman, Jim Carrey as the Riddler, Tommy Lee Jones as Two-Face. Yeah.) The L.A. Rams were moving back to St. Louis, and Michael Jordan was moving back to the Bulls. Violence was rife in the Balkans. The O.J. trial happened.

It was a very different time, to be sure. But all that was nothing compared to how different the world of supercomputing was.

The Top500 list from June 1995 shows just how far the possibilities of silicon have come in the past 20 years. Performance figures are listed in gigaflops, rather than the teraflops of today, meaning that, for example, the 10th-place entrant in this week’s newly released list is more than 84,513 times faster than its two-decades-ago equivalent.
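That multiplier is easy to reproduce (a sketch; the roughly 4.29-petaflop Linpack figure for 2015's tenth-place system is my assumption, since this article lists only the 1995 machines):

```python
tenth_1995 = 50.8           # GFLOP/s, Cray T3D-MC512-8 (from this article)
tenth_2015 = 4_293_306.0    # GFLOP/s, assumed ~4.29 PF for 2015's #10 system

# Dividing the two Linpack scores gives the 20-year speedup at rank 10
ratio = tenth_2015 / tenth_1995
print(f"{ratio:,.0f}x faster")
```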

#10: 1995 – Cray T3D-MC512-8, Pittsburgh Supercomputing Center, 50.8 GFLOP/S
The Pittsburgh Supercomputing Center is still an active facility, though none of its three named systems – Sherlock, Blacklight and Anton – appear on the latest Top500 list. The last time it was there was 2006, with a machine dubbed Big Ben placing 256th. (The PSC’s AlphaServer SC45 took second place in 2001 with a speed of 7,266 gigaflops.)

#9: 1995 – Cray T3D-MC512-8, Los Alamos National Laboratory, 50.8 GFLOP/S
Yes, it’s the same machine twice, which demonstrates that supercomputers were less likely to be bespoke systems filling giant rooms of their own, and more likely to be something you just bought from Cray or Intel. JUQUEEN, the ninth-place system on the June 2015 list, is more than 98,600 times as powerful as the old T3D-MC512-8, a 512-core device that appears to have been more or less contained to a pair of big cabinets.

#8: 1995 – Thinking Machines CM-5/896, Minnesota Supercomputer Center, 52.3 GFLOP/S
Thinking Machines was an early supercomputer manufacturer, based in the Boston area, that had already gone bankrupt by the time the June 1995 Top500 list was published – Sun Microsystems would eventually acquire most of its assets in a 1996 buyout deal. The University of Minnesota’s HPC department is now the Minnesota Supercomputing Institute, whose new Mesabi system placed 141st on the latest list at 4.74 teraflops.

#7: 1995 – Fujitsu VPP500/42, Japan Atomic Energy Research Institute, 54.5 GFLOP/S
Fujitsu’s been a fixture on the Top500 since the list was first published in 1993, and 1995 was no exception, with the company picking up three of the top 10 spots. The Japan Atomic Energy Research Institute has dropped off the list since 2008, though it may be set to return soon, with the recent announcement that it had agreed to purchase a Silicon Graphics ICE X system with a theoretical top speed of 2.4 petaflops – which would place it just outside the top 25 on the latest list.

#6: 1995 – Thinking Machines CM-5/1056, Los Alamos National Laboratory, 59.7 GFLOP/S
For the record, we’re well over the 100,000x performance disparity between the sixth-place systems of 1995 and 2015 at this point. One thing that’s notable about 1995’s systems compared to today’s is the small number of cores – the CM-5 that placed sixth in 1995 used 1,056 cores, and the Fujitsu behind it used only 42. Per-core performance is still orders of magnitude higher today, but it’s worth noting that a huge proportion of the total performance increase is due to the vastly higher number of processor cores in use – no system on the 2015 list had fewer than 189,792, counting accelerators.
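A quick per-core comparison makes the point. The two 1995 entries come from this article; the Tianhe-2 figures (roughly 33.86 petaflops across about 3.12 million cores) are my assumption, included only for scale:

```python
# (total GFLOP/s, core count) per system
systems = {
    "Fujitsu VPP500/42 (1995)":      (54.5, 42),
    "Thinking Machines CM-5 (1995)": (59.7, 1056),
    "Tianhe-2 (2015, assumed)":      (33_862_700.0, 3_120_000),
}

# Per-core throughput: total performance divided by core count
per_core = {name: gflops / cores for name, (gflops, cores) in systems.items()}
for name, gf in per_core.items():
    print(f"{name}: {gf:.3f} GFLOP/s per core")
```

Per-core throughput grew by perhaps one to two orders of magnitude; the rest of the gain came from multiplying the core count by thousands.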

#5: 1995 – Fujitsu VPP500/80, Japan National Laboratory for High Energy Physics, 98.9 GFLOP/S
The power factor is back down to about 87,000 with the substantial jump in performance up to the 80-core Fujitsu’s nearly 100 gigaflop mark. The VPP500/80 would remain on the list through 1999, never dropping below the 90th position.

#4: 1995 – Cray T3D MC1024-8, undisclosed U.S. government facility, 100.5 GFLOP/S
The T3D MC1024-8 system used at an undisclosed government facility (which is almost certainly not the NSA, of course) was the first on the 1995 list to top the 100 gigaflop mark, and stayed on the Top500 until 2001. That’s a solid run, and one that the Fujitsu K computer, on its fourth year in the top 5, would do well to emulate.

#3: 1995 – Intel XP/S-MP 150, Oak Ridge National Laboratory, 127.1 GFLOP/S
The Department of Energy’s strong presence on the upper rungs of the Top500 list is one thing that hasn’t changed in 20 years, it seems – four of the top 10 in both 2015 and 1995 were administered by the DOE. The XP/S-MP 150 system boasted roughly three times as many processor cores as all but one other entry on the list, at 3,072, in a sign of things to come.

#2: 1995 – Intel XP/S140, Sandia National Laboratory, 143.4 GFLOP/S
Indeed, the other Intel system on the 1995 list was the only other one with more cores, at 3,608. It’s even starting to look more like a modern supercomputer.

#1: 1995 – Fujitsu Numerical Wind Tunnel, National Aerospace Laboratory of Japan, 170 GFLOP/S
The Numerical Wind Tunnel, as the name suggests, was used for fluid dynamics simulations in aerospace research, most notably the classic wind tunnel testing to measure stability and various forces acting on an airframe at speed. The 2015 winner, China’s Tianhe-2, is almost two hundred thousand times as powerful, however.
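That "almost two hundred thousand times" claim checks out arithmetically (assuming Tianhe-2's roughly 33.86-petaflop Linpack score, which this article doesn't list):

```python
nwt = 170.0                # GFLOP/s, Numerical Wind Tunnel (from this article)
tianhe2 = 33_862_700.0     # GFLOP/s, assumed ~33.86 PF for Tianhe-2

# 1995's #1 vs. 2015's #1: just under a 200,000x gap
ratio = tianhe2 / nwt
print(f"{ratio:,.0f}x")
```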

Windows 10 to run rings around customers

Microsoft talks up release cadence rings within the consumer-oriented Current Branch; promises at least one fast, one slower

Microsoft’s top operating system executive today confirmed that the two main Windows 10 update and upgrade “branches” will offer customers multiple “rings,” or tempos, that they can select to receive changes quickly or after they’ve been tested by others.

“We won’t be updating every Windows consumer device on the second Tuesday of the month,” said Terry Myerson, who leads the Windows and Device Group. “We’re going to let consumers opt into what we’re calling ‘rings.’ Some consumers just want to go first. And we have consumers that say, ‘I’m okay not being first.'”

Myerson spoke during the Monday keynote that opened Microsoft’s Worldwide Partner Conference (WPC) in Orlando, Fla.

Customers who want to opt in to a “fast” ring on the Current Branch — the Windows update track geared towards consumers running Windows 10 Home — will receive updates first, while those who adopt the “slow” ring will get slightly more stable and reliable code later. There may be other rings, but those were the two that Myerson mentioned.

The fast-slow ring approach debuted with the Windows Insider Program, the preview and testing deal that kicked off in October 2014.

While Myerson had said in May that the Current Branch for Business (CBB), the primary release track for Windows 10 Pro users, and one that Windows 10 Enterprise can also adopt, would feature rings, he had not said the same about consumers’ CB. Computerworld and some analysts had assumed that the two tracks — CB and CBB — would each offer at least two rings when the new OS launched July 29.

“Once Windows 10 ships, rings won’t determine how many updates you get, but rather your place in the queue to get a new update,” explained Steve Kleynhans of Gartner in a recent interview. “As such, rings will be more about controlling the rate at which the updates flood out into market.”

Windows Insider participants have been placed on the slow ring by default, requiring users to change a setting to get on the faster cadence. It’s unknown whether the same slow-is-the-default setting will be used on the final edition’s CB and CBB tracks.

There are still unanswered questions about Windows 10’s update and upgrade pace, including the lag between fast and slow, but Microsoft has slowly been dribbling out details. There will be several tracks, including Insider — which will continue to serve the adventurous with previews — Current Branch, Current Branch for Business, and Long-term Service Branch (LTSB), a static channel that delivers only security patches and critical bug fixes. LTSB does not offer the feature and functionality, user interface (UI) and user experience (UX) changes the others will receive three times annually.

The plethora of branches and rings, and their staggered releases — which will result in a 16-month active lifespan for any one build because of delayed deployment options for CBB users — has raised questions about fragmentation that could affect developers and support teams, or make management more complicated for corporate IT staffs.

Analysts, however, have largely discounted such concerns, saying that while Windows 10 will create some fragmentation, ultimately it will create a more uniform ecosystem than the current Windows scene.

“For customers and developers, it won’t be too different than targeting all the Windows versions and service packs that they have to today,” agreed Gary Chen, an analyst at IDC. “There are really only four rings that matter, [the two each in] CB and CBB, and a business may only be concerned about CBB, so that’s effectively two rings to manage, not a big change from what they support today.”

Today, Myerson again denigrated what he dubbed “selective patching” to make a less-than-subtle pitch for adoption of CBB served by the new Windows Update for Business (WUB) service. “This introduces costs, complexities and delays,” Myerson said of selective patching and updating. “In today’s threat environment, that’s a problem.” WUB will deliver all update changes, eliminating the pick-a-patch practice that IT administrators have relied on for decades. (Shops on CBB may also use the traditional WSUS — Windows Server Update Services — to selectively deploy updates.)

Myerson also reiterated the strategy of Windows 10, which Microsoft has characterized as “Windows as a service,” by emphasizing the continual updates and upgrades that will reach customers. “We’re committed to continuous upgrades of [the] Windows device base,” he said.

While Myerson also used the phrase “the supported lifetime of the device” today in talking about updates, he did not define it. That phrase has been dissected since its first use in January because it will limit how long free updates and upgrades are offered to Windows 10 devices. Late last month, the Redmond, Wash., company said that device lifetimes would range from two to four years.

In that disclosure — a footnote on a presentation outlining how Microsoft will defer some revenue from Windows 10 — Microsoft said the device lifetime would be calculated on “customer type,” hinting that it would separate consumer and business device owners, probably by sniffing out the edition of Windows 10 running on the device.

What remains unclear is which devices will receive feature/functionality and UI/UX updates and upgrades for the minimum of two years, which will get the maximum of four, and which fall somewhere in between.

 

MCTS Training, MCITP Training

Best Microsoft MCTS Certification, Microsoft MCITP Training at certkingdom.com

 

Why the open source business model is a failure

Most open source companies can’t thrive by selling maintenance and support subscriptions. But the cloud may be the key to revenue generation.

Open source software companies must move to the cloud and add proprietary code to their products to succeed. The current business model is a recipe for failure.

That’s the conclusion of Peter Levine, a partner at Andreessen Horowitz, the Silicon Valley venture capital firm that backed Facebook, Skype, Twitter and Box as startups. Levine is also former CEO of XenSource, a company that commercialized products based on the open source Xen hypervisor.

Levine says the conventional open source business model is flawed: Open source companies that charge for maintenance, support, warranties and indemnities for an application or operating system that is available for free simply can’t generate enough revenue.

“That means open source companies have a problem investing in innovation, making them dependent on the open source community to come up with innovations,” he says.

Why is that a problem? After all, the community-based open source development model has proved itself to be more than capable of coming up with innovative and very useful pieces of software.
Revenue limits

The answer is that without adequate funding, open source businesses can’t differentiate their products significantly from the open source code their products are based on, Levine maintains. Because of that there’s less incentive for potential customers to pay for their products rather than continue using the underlying code for nothing. At the very least it limits the amount that open source businesses can hope to charge – putting a cap on their potential revenues. It’s a vicious circle.

“If we look at Red Hat’s market, 50 percent of potential customers may use Fedora (the free Linux distribution) and 50 percent use Red Hat Enterprise Linux (the version supported and maintained by Red Hat on a subscription basis). So a large part of the potential market is carved off — why should people pay the ‘Red Hat tax’?” Levine asks.

You could argue that this is actually good for businesses, because the availability of open source software at no cost provides competition to open source companies’ offerings based on the same code, ensuring that these offerings are available at a very reasonable price.

But if open source businesses can’t monetize their products effectively enough to invest in innovation, then potential corporate clients can’t benefit from the fruits of that innovation, and that’s not so good for customers.
Uneven playing field

The problem is compounded when you consider that open source companies’ products are not just competing with the freely available software on which their products are built. It’s often the case that they also have to compete with similar products sold by proprietary software companies. And that particular playing field is often an uneven one, because the low revenues that open source companies can generate from subscriptions mean that they can’t match the huge sales and marketing budgets of competitors with proprietary product offerings.

It’s an important point because although sales and marketing activities are costly, they’re also effective. If they weren’t, companies wouldn’t waste money on them.

So it follows that open source companies miss out on sales even when they have a superior offering, because having the best product isn’t enough. It’s also necessary to convince customers to buy it, through clever marketing and persuasive sales efforts.

The problem, summed up by Tony Wasserman, a professor of software management practice at Carnegie Mellon University, is that when you’re looking to acquire new software, “open source companies won’t take you out to play golf.”

The result, says Levine, is that open source companies simply can’t compete with proprietary vendors on equal terms. “If you look at Red Hat, MySQL, KVM … in every case where there’s a proprietary vendor competing, they have more business traction and much more revenue than their open source counterparts.”

As an illustration of the scale of the problem, consider Red Hat, generally held up as the poster child of open source companies. It offers an operating system and a server virtualization system, yet its total revenue is about a third of that of specialist virtualization vendor VMware, and about 1/40th of Microsoft’s.
Hybrid future

This is why Levine has concluded that the way for open source companies to make money out of open source software is to abandon the standard open source business model of selling support and maintenance subscriptions, and instead to use open source software as a platform on which to build software as a service (SaaS) offerings.

“I can run a SaaS product by using Fedora as a base, but then building proprietary stuff on top and selling the service. So the monetization goes to the SaaS product, not to an open source product,” says Levine. “I think we’ll start to see an increasing number of SaaS offerings that are a hybrid of open source and proprietary software.”


He adds that many SaaS companies – including Salesforce, Digital Ocean and Github (the latter two backed by Andreessen Horowitz) – already use a mix of open source and proprietary software to build their services.

And Levine says that Facebook is the biggest open source software company of them all. “I was shocked when I realized this, and Google probably is the second biggest,” he says.

Facebook has developed and uses open source software for the infrastructure on which its social network is built, and adds its own proprietary software on top to produce a service it can monetize. Google also generates a large volume of open source infrastructure code, although its search and advertising software is proprietary, he adds.

While the existence of free-to-download software undoubtedly makes it harder for open source businesses to monetize the same software by adding support, maintenance and so on, it’s also the case that these low-cost alternatives must make life more difficult than otherwise for proprietary vendors trying to sell their products into the same market.

That’s because these low-cost alternatives necessarily make the market for proprietary software smaller even if proprietary companies have higher revenues that they can use to innovate, differentiate their products, and market them.

This could help explain why some proprietary software companies are moving their products to the cloud, or at least creating SaaS alternatives. A mature product like Microsoft’s Office suite can largely be functionally replicated by an open source alternative like LibreOffice, but Microsoft’s cloud-based Office 365 product takes the base Office functionality and adds extra services such as file storage, Active Directory integration and mobile apps on top.

That’s much harder for anyone to replicate, open source or not. And it suggests that in the future it will be all software companies, not just open source shops that move to the cloud to offer their software as a service.


Attackers abuse legacy routing protocol to amplify distributed denial-of-service attacks

Servers could be haunted by a ghost from the 1980s, as hackers have started abusing an obsolete routing protocol to launch distributed denial-of-service attacks.

DDoS attacks observed in May by the research team at Akamai abused home and small business (SOHO) routers that still support Routing Information Protocol version 1 (RIPv1). This protocol is designed to allow routers on small networks to exchange information about routes.

RIPv1 was first introduced in 1988 and was retired as an Internet standard in 1996 due to multiple deficiencies, including lack of authentication. These were addressed in RIP version 2, which is still in use today.

In the DDoS attacks seen by Akamai, which peaked at 12.8 gigabits per second, the attackers used about 500 SOHO routers that are still configured for RIPv1 in order to reflect and amplify their malicious traffic.

DDoS reflection is a technique that can be used to hide the real source of the attack, while amplification allows the attackers to increase the amount of traffic they can generate.

RIP allows a router to ask other routers for information stored in their routing tables. The problem is that the source IP (Internet Protocol) address of such a request can be spoofed, so the responding routers can be tricked to send their information to an IP address chosen by attackers—like the IP address of an intended victim.
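To see why this works, here is a minimal sketch (in Python, an illustration rather than anything from Akamai’s report) of the 24-byte full-table request defined in RFC 1058. Note that nothing in the layout authenticates the sender, and the packet travels over UDP, where the source address is trivially forged:

```python
import struct

def ripv1_full_table_request() -> bytes:
    """Build the 24-byte RIPv1 request for a router's entire routing table
    (RFC 1058): a 4-byte header plus one 20-byte route entry whose address
    family is 0 and whose metric is 16 ("infinity")."""
    # command=1 (request), version=1, 2 reserved zero bytes
    header = struct.pack("!BBH", 1, 1, 0)
    # AFI=0, pad, address 0.0.0.0, two must-be-zero words, metric=16
    entry = struct.pack("!HHIIII", 0, 0, 0, 0, 0, 16)
    return header + entry

packet = ripv1_full_table_request()
print(len(packet))  # 24
```

A RIPv1-enabled router answers this query on UDP port 520 with its full routing table; because the protocol has no authentication, the reply goes to whatever source address the attacker wrote into the IP header.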

This is a reflection attack because the victim will receive unsolicited traffic from abused routers, not directly from systems controlled by the attackers.

But there’s another important aspect to this technique: A typical RIPv1 request is 24 bytes in size, but if the responses generated by abused routers are larger than that, attackers can generate more traffic than they otherwise could with the bandwidth at their disposal.

In the attacks observed by Akamai, the abused routers responded with multiple 504-byte payloads—in some cases 10—for every 24-byte query, achieving a 13,000 percent amplification.
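The bandwidth gain behind such figures is simple arithmetic; the helper below (an illustration, not Akamai’s methodology) expresses reflected response size as a percentage of the 24-byte query that triggered it:

```python
def amplification_percent(response_bytes: int, request_bytes: int = 24) -> float:
    """Reflected traffic volume as a percentage of the request that triggered it."""
    return 100.0 * response_bytes / request_bytes

# A single 504-byte payload per 24-byte query is already a 21x gain,
# and routers that return several payloads per query scale that linearly.
print(amplification_percent(504))  # 2100.0
```

This is why the per-device payload count matters so much: each additional 504-byte payload a router returns adds another 2,100 percentage points of amplification.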

Other protocols can also be exploited for DDoS reflection and amplification if servers are not configured correctly, including DNS (Domain Name System), mDNS (multicast DNS), NTP (Network Time Protocol) and SNMP (Simple Network Management Protocol).

The Akamai team scanned the Internet and found 53,693 devices that could be used for DDoS reflection using the RIPv1 protocol. Most of them were home and small business routers.

The researchers were able to determine the device make and model for more than 20,000 of them, because they also had their Web-based management interfaces exposed to the Internet.

Around 19,000 were Netopia 3000 and 2000 series DSL routers distributed by ISPs, primarily from the U.S., to their customers. AT&T had the largest concentration of these devices on its network—around 10,000—followed by BellSouth and MegaPath, each with 4,000.

More than 4,000 of the RIPv1 devices found by Akamai were ZTE ZXV10 ADSL modems and a few hundred were TP-Link TD-8xxx series routers.

While all of these devices can be used for DDoS reflection, not all of them are suitable for amplification. Many respond to RIPv1 queries with a single route, but the researchers identified 24,212 devices that offered at least an 83 percent amplification rate.

To avoid falling victim to RIPv1-based attacks, server owners should use access control lists to restrict Internet traffic on UDP source port 520, the Akamai researchers said in their report. Meanwhile, the owners of RIPv1-enabled devices should switch to RIPv2, restrict the protocol’s use to the internal network only or, if neither of those options is viable, use access control lists to restrict RIPv1 traffic only to neighboring routers.
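As a concrete illustration of those recommendations, the Linux iptables rules below sketch both defenses. This is a minimal example, not a complete firewall policy: the neighbor address 192.0.2.1 is a placeholder (from the TEST-NET range), and the equivalent configuration on dedicated router hardware will differ.

```shell
# Server side: drop reflected RIPv1 traffic, which arrives
# from UDP source port 520 on the abused routers.
iptables -A INPUT -p udp --sport 520 -j DROP

# Router side: if RIPv1 must stay enabled for an internal segment,
# accept it only from a known neighboring router (placeholder address)
# and discard all other traffic to UDP port 520.
iptables -A INPUT -p udp --dport 520 -s 192.0.2.1 -j ACCEPT
iptables -A INPUT -p udp --dport 520 -j DROP
```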

