AMF bowls for customers with video sharing over managed IP

AMF Bowling Centers Inc. is wrapping up the roll-out of a $2 million nationwide network equipment upgrade that supports Voice over IP (VoIP) and video streaming, and includes turning over all network management to Verizon Business for the next three years. Under the contract, AMF now relies on Verizon for an IP network that supports VoIP, point-of-sale devices and credit card transactions, and Web hosting. AMF has 300 bowling centers in 38 states and more than 9,000 employees. The IP network also supports a centralized video surveillance system that is now being launched, as well as a centralized energy management system being tested in several bowling centers.

The contract with Verizon, signed early this year, will cost AMF about $800,000 a year, in addition to the $2 million equipment cost for Adtran routers in each center and cabling installations, said Harsha Bellur, vice president of IT at AMF. Before hiring Verizon for the managed IP service, AMF was using Verizon to provide a site-to-site VPN service, which relied on cable modems and DSL and required AMF to work with 36 different ISPs. With cable modems and DSL, bandwidth was inconsistent, and demanding applications like video were not possible. Because bowling has become a multimedia experience for customers, video and audio streaming of music videos is now piped to most of the bowling centers over the IP network, Bellur said. "We have extreme sound and light shows over projection screens in most locations with music videos that play while people are bowling," Bellur said. AMF's annual network services cost has gone up slightly with the Verizon managed service, but the number of IP applications and network reliability have far exceeded what was previously available, Bellur said. "The ROI was on the wall, but we had to do this and it made a lot of sense to invest, even with the recession," Bellur said in an interview. One of the biggest advantages of using a managed service from a nationwide provider like Verizon is having Service Level Agreements (SLAs) to guarantee service, Bellur said. The SLAs have already come in handy: AMF received a credit from Verizon because regional flooding recently knocked out VoIP service in Atlanta and Virginia Beach, Va., for more than 3.5 hours, exceeding the outage threshold specified in the SLA, Bellur said. "Verizon has kept up with its SLAs and offered a financial remedy," he said.
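The mechanics of an outage credit like the one AMF received can be sketched in a few lines. The 3.5-hour threshold and the ~$800,000 annual fee come from the article; the credit rate and per-outage structure are purely illustrative assumptions, not AMF's actual contract terms.

```python
# Hypothetical SLA credit check. The 3.5-hour threshold and ~$800,000/year
# fee are from the article; the 5% credit rate is an assumed figure.

OUTAGE_THRESHOLD_HOURS = 3.5   # outages longer than this trigger a credit
MONTHLY_FEE = 800_000 / 12     # managed-service fee across all sites
CREDIT_RATE = 0.05             # assumed: 5% of the monthly fee per qualifying outage

def sla_credit(outage_hours: float) -> float:
    """Return the credit owed for a single outage, or 0.0 if under threshold."""
    if outage_hours <= OUTAGE_THRESHOLD_HOURS:
        return 0.0
    return round(MONTHLY_FEE * CREDIT_RATE, 2)

# Two flood-related outages, as in the Atlanta / Virginia Beach incident
# (durations invented for the example):
total = sum(sla_credit(h) for h in [4.25, 5.0])
```

Real SLAs are usually more graduated (credit tiers by outage length, caps per billing cycle), but the deny-below-threshold shape is the common core.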

While the Verizon VoIP quality is generally good, one downside is that voice service goes down whenever there is a data network outage. Because of the recent flood-related outages, AMF is planning to provision at least one analog phone line in each center to provide automatic failover for voice services. "It's back to the future with the analog failover," he said, noting that AMF is now testing existing, unused analog lines to see which are resilient enough for failover duty. "The voice outages were a challenge and we learned the hard way with the floods," he said. "It caused some heartburn and was not something we anticipated, but we have options." The managed services contract with Verizon has not led to layoffs among the 29-person IT staff, even though Verizon is now managing all circuits, routers and cloud computing services. The added Verizon support has meant AMF can strengthen its end-user computer support desk, which is now staffed by seven of the 29 in IT, Bellur said. AT&T Inc. and regional service provider Paetec also bid on the contract; Bellur and others picked Verizon partly because of its nationwide network, he said.

The centralized energy management system is now in trial runs, testing whether the IP network can automatically turn heating and air conditioning on and off according to each bowling center's hours. The video surveillance system, designed to prevent theft, is just being installed to use the IP network, Bellur said. While AMF centers are actively using the network to support video and audio, Bellur said his team is contemplating using video displays as digital signs that would show pricing and examples of products on sale, including food and alcohol. Training videos could also be piped over the IP network, Bellur said. Self-service kiosks for ordering food are possible, and scores could be posted online and shared over the nationwide network. "Teams between two cities could host a tournament sharing tournament brackets," he said. "We're brainstorming, but it all comes down to costs."
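The schedule-driven HVAC control AMF is testing boils down to a simple rule: run climate control only during (and shortly before) each center's operating hours. The sketch below illustrates that logic; the center hours, one-hour pre-conditioning lead, and the control interface are all assumptions for illustration, not details of AMF's actual system.

```python
# Minimal sketch of schedule-driven HVAC control like the system AMF is
# testing. Center hours and the lead time are invented example values.
from datetime import datetime, timedelta, time

# Assumed operating hours per center (same every day, for simplicity).
CENTER_HOURS = {
    "atlanta":        (time(10, 0), time(23, 0)),
    "virginia_beach": (time(9, 0),  time(22, 0)),
}

# Pre-cool/pre-heat lead time so the building is comfortable at opening.
LEAD_MINUTES = 60

def hvac_should_run(center: str, now: datetime) -> bool:
    """Return True if the center's HVAC should be on at `now`."""
    open_t, close_t = CENTER_HOURS[center]
    open_dt = datetime.combine(now.date(), open_t)
    close_dt = datetime.combine(now.date(), close_t)
    return open_dt - timedelta(minutes=LEAD_MINUTES) <= now <= close_dt

# 9:15 a.m. falls inside the one-hour lead before a 10 a.m. opening:
print(hvac_should_run("atlanta", datetime(2009, 11, 6, 9, 15)))   # True
print(hvac_should_run("atlanta", datetime(2009, 11, 6, 23, 30)))  # False
```

A centralized version of this would push the schedule to each center's controllers over the IP network rather than evaluating it on site.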

A longer-term conceptual project under discussion would stream videos of bowlers or birthday parties held at bowling centers to relatives in other cities.

Seven tips to migrate and manage Windows 7

Enterprise IT organizations eager to upgrade aging Windows XP and Vista systems to Microsoft's just-released Windows 7 could make the process a whole lot smoother by investigating a handful of management technologies and processes aimed at greasing the skids of such a major software update. "At some point, Windows users will need to transition over to Windows 7 because XP will no longer be supported and Vista just didn't take off in terms of adoption," says Steve Brasen, principal analyst at Enterprise Management Associates (EMA). "The ability to manage and automate the processes around upgrading to Windows 7 will be critical for midsize and enterprise organizations." Here industry watchers share seven essential steps enterprise IT managers must take when considering a move to Microsoft's Win 7.

1. Test desktop durability. According to data from Forrester Research, even two-and-a-half years after the general availability of Windows Vista, Windows XP still runs 86% of all enterprise PCs powered by Windows. The research also shows that those considering an upgrade won't be able to take a direct path from XP to Windows 7, which presents a few challenges. For one, hardware could be lacking the necessary drivers, memory and other components. "Migrating XP to Win 7 will challenge many IT administrators because you can't upgrade directly. Some are suggesting companies buy all new hardware or perform a complete refresh of the computer," explains Katherine Wattwood, vice president of product development for Persystent Software. Persystent Suite offers customers features that test existing PCs for hard disk space and other components required by Win 7, helping IT managers determine which PCs could handle the updated operating system and which would need to be swapped out or upgraded themselves. "Pre-migration planning and hardware compatibility testing would be critical to determine which PCs are Win 7-ready," Wattwood says.

2. Plan for licensing. Unlike previous Windows operating systems, such as XP, Win 7 offers several editions, or options, that enterprise IT departments must consider when planning to migrate to the latest software; industry analysts say three should be considered by IT managers. First, Windows 7 Professional, comparable to Vista Business, could be the least expensive option, according to Forrester Research, which points out this edition is available via OEM, retail or volume licensing. Windows 7 Enterprise is the edition companies have the right to deploy if they own a Windows license covered by Microsoft Software Assurance, the vendor's software maintenance program offered as an option with volume licensing. The enterprise edition offers additional features that global organizations might find useful, such as DirectAccess, which gives mobile users access to corporate resources without a VPN, and BranchCache, a feature Microsoft says decreases the time users at remote offices spend waiting to download files across the network. Windows 7 Ultimate, Forrester says, could be considered more of a consumer option and isn't sold via volume licensing, but it could be put to use as a media PC in a corporate environment. Forrester advises in a recent research report that companies take many factors into consideration when planning for Win 7 licenses; existing licenses, software agreements and the upgrade path should be among them. "Your historical approach to refreshing your desktops and laptops combined with the age of your infrastructure by the time you're ready to start your Windows 7 deployment will impact whether you should introduce it via a forklift or 'big bang' approach or via the natural rolling refresh cycle," the report reads. "Your license plans should not just be limited to your Windows upgrade strategy. There can be opportunities to take advantage of bundles that can drive down costs across your Microsoft investments."

3. Ensure application compatibility. Not only does hardware need to be tested to see if it can take Windows 7, but software applications must also be checked for compatibility with the new version. "There are a few aversions enterprise IT has to an upgrade right now," EMA's Brasen says. "One of them being there is still a big problem with proprietary applications and drivers that are just not compatible with Vista or Win 7. Until companies can reach a level of compatibility and applications are brought up to speed, a transition will be difficult." In fact, Brasen says he is "not aware of a systems management vendor that doesn't have an upgrade path for Windows 7. They know it's coming. Even if their current subscribers aren't planning for it in the next few months, it is going to happen at some point." That's why enterprise IT organizations should be testing application compatibility now; products from companies like Persystent and CA, among many others, offer application compatibility testing. Manually conducting such tests would be extremely time-consuming, analysts say. Automated tools run on their own, detect problematic machines and applications, perform an inventory and report back to IT managers about the potential problems and foreseeable performance issues desktops could incur when running Windows 7. Vendors argue that by adding automation to this process, they reduce costs and time to deployment. For instance, CA IT Client Manager uses policies to reduce hands-on labor. "Our software allows IT to set policies that allow a set of individuals to have certain applications on their systems, while another group would have a different policy applied to them," says Laural Gentry, senior principal product manager at CA. "Our product supports the decision-making, planning process by performing asset inventory, application and infrastructure compatibility tests in order to ensure the entire migration runs more smoothly." For many companies, though, acquiring enterprise software to help with a migration might be cost-prohibitive.

4. Take advantage of automation.

Industry watchers argue that attempting to migrate or manage a Win 7 environment without automation technology will overwhelm IT staff and nearly guarantee problems with the implementation. "An automated system management platform, available from many vendors, could bundle up the image and send it out to many machines as part of an automated process," Brasen says. "Companies will experience a lot of pain upgrading to Windows 7 if they can't get an automated platform in place." For many larger organizations, automated features are likely already part of their client systems management products from the likes of LANDesk, CA, Persystent, Kace, BigFix and several others. But for small to midsize organizations, automated deployment isn't a technology they already have in house. Microsoft has considered this and made a free tool available: Microsoft Deployment Toolkit (MDT) 2010, software optimized to support Windows 7 deployments that includes built-in capabilities to support customers migrating from Windows XP to Windows 7, the company says. MDT 2010 Beta 2 is currently available for free download.

5. Consider client virtualization. The release of Windows 7 has companies considering another new technology: virtual desktops. "Microsoft is offering compelling reasons for customers to migrate to Windows 7," says Benjamin Gray, senior analyst with Forrester Research. The promise of ease of management and increased security that virtual desktop technology offers could drive customers to consider the technology when they have budget dollars for a PC refresh. For its part, Microsoft offers two products that take advantage of virtualization and could be considered a means of managing a migration to, or ongoing deployment of, Windows 7. Microsoft Application Virtualization, the company says, helps reduce downtime for customers by turning Windows applications into "centrally managed virtual services that are delivered to any licensed Windows desktop or laptop." And Microsoft Enterprise Desktop Virtualization allows desktop administrators to create, deliver and centrally manage a virtual Windows XP or 2000 environment (based on Microsoft Virtual PC 2007) and run legacy applications on Windows Vista desktops, the vendor says. But Microsoft isn't the only vendor touting virtualization as an option: VMware and Citrix also boast virtual desktop infrastructure and could provide viable alternatives to a full-blown Windows 7 migration, industry watchers say. "IT managers would be able to go with a virtualization solution as well. If you are doing desktop virtualization, you can deploy your virtual container for the new desktop environment down to each one of the client endpoints. It would be as simple as setting one up and deploying it out to many," Brasen says. "Microsoft, VMware and Citrix would all have options for customers here."

6. Replace hardware. For some IT organizations, a migration plan could morph into a replacement plan. According to industry watchers, the economic recession had many IT decision-makers postponing hardware upgrades and equipment investments until a recovery was in sight. Couple the Windows 7 availability with a need to refresh hardware, and some might just kill two birds with one stone, making the migration challenge a moot point. Outdated desktops and laptops could be easier to swap out than update, and vendors working with Microsoft would equip new hardware with the most recent operating system. "Many organizations with aging infrastructure could do a massive PC refresh by mid-2010 and replace existing hardware with new desktops and laptops," Forrester's Gray says. PC vendors have worked with Microsoft to deliver machines optimized to work with Windows 7, such as those with the new Windows 7 Lenovo Enhanced Experience. If a customer buys a Windows 7 PC and maintains the pre-loaded optimization features, they could experience benefits such as faster reboot and shutdown times, which ultimately provide productivity improvements to end users, says Bob Dieterle, executive director of worldwide services at Lenovo. "When looking at our customer engagements, almost 40% of PCs under management are out of warranty, and not really able to take advantage of the new features offered," he says. "Our customers would get optimized performance, battery life and even applications, which have been redesigned for Windows 7."

7. Prepare for patch management. Any client system management plan must include patch management. Before migrating to a new operating system, enterprise IT managers must be aware of how the upgrade will affect existing patch management procedures and ensure they have any new and necessary policies in place before the rollout. "Maintaining the environment would mandate proven patch management technologies. Many of the vendors offering automated features in migration packages also would be able to deploy patches on a one-to-many basis for organizations adopting Windows 7," EMA's Brasen says. "IT managers want to get to the point of doing one download of the patch from the site and distributing it out internally, essentially a process that is much faster and much less intrusive on the client devices."

Follow Denise Dubie on Twitter.
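As a footnote to the hardware-readiness tip above: Microsoft's published minimums for Windows 7 are a 1 GHz processor, 1 GB of RAM and 16 GB of available disk for the 32-bit edition (2 GB and 20 GB for 64-bit). The sketch below shows what that kind of pass/fail check looks like against an inventory; the fleet records are made-up sample data, and a real tool like Persystent Suite or CA IT Client Manager would of course gather the inventory itself.

```python
# Rough Windows 7 readiness check. The minimums are Microsoft's published
# requirements; the inventory records below are hypothetical examples.

WIN7_MINIMUMS = {
    "32-bit": {"cpu_ghz": 1.0, "ram_gb": 1, "free_disk_gb": 16},
    "64-bit": {"cpu_ghz": 1.0, "ram_gb": 2, "free_disk_gb": 20},
}

def win7_ready(pc, edition="32-bit"):
    """Return a list of failed checks; an empty list means the PC qualifies."""
    failures = []
    for key, minimum in WIN7_MINIMUMS[edition].items():
        if pc.get(key, 0) < minimum:
            failures.append("%s: have %s, need %s" % (key, pc.get(key, 0), minimum))
    return failures

# Sample inventory (hypothetical): one modern PC, one aging XP box.
fleet = {
    "PC-001": {"cpu_ghz": 2.4, "ram_gb": 4,   "free_disk_gb": 80},
    "PC-002": {"cpu_ghz": 1.6, "ram_gb": 0.5, "free_disk_gb": 12},
}
for name, pc in fleet.items():
    problems = win7_ready(pc)
    print(name, "ready" if not problems else problems)
```

This ignores driver availability and the DirectX 9/WDDM graphics requirement, which is exactly the gap the commercial assessment tools fill.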

Cloud Engines updates Pogoplug media sharing device

On Friday, Cloud Engines introduced the second generation of its Pogoplug multimedia sharing device. The Pogoplug is designed to plug into your home or small-office network and let you access and share the contents of USB hard drives over the Internet using a standard Web browser. The new version adds several new features.

First off, it now has four USB 2.0 ports instead of one, so you can connect multiple USB hard drives or flash drives without the need for a USB hub. (It still connects to your router using gigabit Ethernet.) Along with that, there's now support for global search across multiple drives. Also new are improved transcoding and wider support for streaming movies on the Web or to an iPhone app; it works with H.264 video, as well as common photo types, but doesn't support DRM-protected media. Other additions include the ability to automatically sync photos, music, videos, and other content from apps such as iTunes and iPhoto; tighter integration with Facebook, Twitter, and MySpace; automatic organization of your music, photos, and videos; and an address book that remembers the e-mail addresses with which you've shared content for future sharing. (Many of the enhancements will be available to current Pogoplug owners as well.) Pogoplug supports OS X 10.4 and higher, Windows XP and Vista, and Linux; the Safari, Firefox 3, IE 7, IE 8, and Chrome Web browsers; and hard drives formatted as NTFS, FAT32, Mac OS Extended Journaled and non-Journaled (HFS+), and EXT-2/EXT-3. Although there are no specific bandwidth requirements listed, the company says that a typical DSL connection (with 512 Kbps upload speed) works fine. Cloud Engines expects to ship the new Pogoplug before the end of the year for $129, and is currently taking pre-orders.
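That 512 Kbps figure matters because remote access to a home device is capped by the connection's upload side, not its download side. A quick back-of-the-envelope calculation (all numbers illustrative, not Cloud Engines' specifications) shows why transcoding to a lower bitrate is essential for streaming:

```python
# Why upload speed is the bottleneck for a home-hosted device like Pogoplug.
# The 512 Kbps figure is from the article; the overhead factor is an assumption.

UPLOAD_KBPS = 512          # typical DSL upstream, per the article
usable_kbps = UPLOAD_KBPS * 0.85   # assume ~15% lost to TCP/HTTP overhead

# A remote viewer can only sustain a stream at roughly usable_kbps, which is
# why the device transcodes video down to mobile-friendly bitrates.

# Time to pull a 5 MB photo from home over that link:
photo_mb = 5
seconds = (photo_mb * 8 * 1024) / UPLOAD_KBPS   # megabytes -> kilobits

print("~%.0f Kbps usable for streaming" % usable_kbps)
print("5 MB photo takes ~%.0f s" % seconds)
```

So "a typical DSL connection works fine" really means: fine for photos, music, and heavily transcoded video, not for streaming full-bitrate H.264 files as stored.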

RemoteSight turns an iSight into CCTV

Do you wish your iSight were more like the unblinking, ever-vigilant Eye of Sauron? Then you might be interested in Ben Software's new RemoteSight, an application that turns an iSight camera into a CCTV-style security camera, accessible over a network via Web browser. RemoteSight can also act as an integrated camera source for Ben Software's SecuritySpy, which aggregates video feeds from multiple cameras into a heads-up multi-video display.

RemoteSight captures both audio and video from the host Mac's iSight camera (or any attached video input device) and streams it out through an integrated Web server; the video is accompanied by a live timestamp. Any Web browser can connect to the server across an internal network, and remote viewing over the Internet should be possible if the nonstandard ports used by the Web server are opened on the router to allow this traffic. The Web server also provides an option to remotely view what is happening on the Mac's screen. Administrative users can turn off monitoring feeds individually, and all connections to the Web server can be protected with username and password authentication. RemoteSight costs $27, and a fully functional demo is available so you can give it a try.

RemoteSight runs as a faceless application, with no indication in the Dock that it is operating; however, a menu-bar item appears that cannot be easily removed, and (where available) the iSight LED light is turned on to indicate that the camera is in use. System requirements call for OS X 10.4.11 or later, 512GB of RAM (I'll assume that's a typo and you only need 512MB), and a video input device, such as a built-in iSight camera or external FireWire or USB camera.

Ex-Ford engineer charged with trade secret theft

A former product engineer at Ford Motor Co. has been charged with stealing sensitive design documents from the auto maker worth millions of dollars. Xiang Dong Yu, of Beijing, also known as Mike Yu, was arrested Wednesday at Chicago's O'Hare International Airport upon his entry into the U.S. from China, where he is working for a Ford rival. Yu, 47, was charged with theft of trade secrets, attempted theft of trade secrets and unauthorized access to protected computers.

The arrest was announced by Terrence Berg, U.S. attorney for the Eastern District of Michigan. Each of the theft-related charges carries a maximum of 10 years in prison and a fine of up to $250,000; Yu faces a maximum of five years and a $250,000 fine on the charge of accessing a protected computer. According to the indictment papers, Yu was employed at Ford between 1997 and 2007. In his role as a product engineer at Ford, Yu had access to trade secrets contained in Ford system design specification documents. The documents contained detailed information on performance requirements and associated testing processes for numerous major components in Ford vehicles. According to the indictment papers, Ford has spent "millions of dollars and decades on research, developing and testing" to create the requirements in the system design documents.

The documents are created and maintained by subject-matter experts at Ford and are used by Ford design engineers when building new vehicles and by suppliers providing parts to the company. In June 2005, Yu is alleged to have traveled to China in an attempt to find a job in the automotive industry. Before leaving on the trip, Yu is alleged to have downloaded several system design specification documents, including some unrelated to his work, onto an external hard drive that he took with him to China. Yu resumed his job search in August 2006 and was offered a job with electronic and automobile component manufacturer Foxconn, PCE Industry Inc. in November of that year. A few days after Yu accepted the job in December 2006, he is alleged to have downloaded more than 4,000 Ford documents to a hard drive.

Later that same month Yu left to work at Foxconn, PCE's facility in Shenzhen, China, with the stolen Ford documents in his possession. The documents included information on Ford's engine and transmission mounting subsystem, electrical distribution system, front and rear side door structures, steering wheel assembly, and instrument panel and console subsystem. Yu did not inform Ford about his new job until January 2007. Slightly more than a year later, Yu apparently attempted to use the stolen trade secrets when applying for a new job with an automotive company in China. When those efforts proved unsuccessful, he accepted another job offer at Beijing Automotive Co., which was described in court documents as a Ford rival. It is not clear from the indictment papers how authorities learned about Yu's attempts to use the stolen information in his job search in China. It is not apparent, for instance, whether the companies that Yu applied to for jobs informed Ford.

It's also not clear whether any of the companies where Yu worked used any of the information that Yu allegedly had stolen. The court papers also mention that Ford's security controls included "marking" sensitive documents. A call requesting comment from the U.S. Attorney's office for the Eastern District of Michigan was not immediately returned. The incident is similar to other trade-secret thefts involving users with privileged access to corporate systems and data. Earlier this month, Hong Meng, a former research scientist at DuPont USA, was indicted on charges related to the theft of trade secrets. Meng is alleged to have downloaded sensitive trade secrets pertaining to DuPont's new thin-display technology, called "organic light-emitting diode," or OLED. The indictment charges Meng with attempting to profit from the information by using it to commercialize OLED products in China in conjunction with Peking University in Beijing.

Brian Cleary, vice president of marketing at security vendor Aveksa, said the incident is another reminder of why companies need to implement a "governance framework" for managing, monitoring and logging all access and activity involving sensitive data. Companies should be implementing risk-based access controls for sensitive data, he said, focused on understanding what an individual's role is and then making sure that individual has access only to the specific information needed for the job. In 2007, Gary Min, another former scientist at DuPont, admitted to stealing an estimated $400 million worth of proprietary company information. He is serving an 18-month sentence in federal prison.
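The role-based control Cleary describes reduces to two rules: deny by default, and log every decision. A minimal sketch of that idea follows; the roles and document classes are hypothetical examples, not Ford's or Aveksa's actual model.

```python
# Minimal sketch of role-based access with audit logging: grant each role
# only the document classes its job requires, and record every decision.
# Roles and labels below are invented for illustration.

ROLE_PERMISSIONS = {
    "product_engineer": {"specs_own_subsystem"},
    "design_engineer":  {"specs_own_subsystem", "test_processes"},
    "security_admin":   {"audit_logs"},
}

def can_access(role, doc_class):
    """Deny by default; allow only what the role explicitly needs."""
    return doc_class in ROLE_PERMISSIONS.get(role, set())

def audit(user, role, doc_class):
    """Record the decision -- the 'monitoring and logging' half of governance."""
    verdict = "ALLOW" if can_access(role, doc_class) else "DENY"
    return "%s user=%s role=%s doc=%s" % (verdict, user, role, doc_class)

# A product engineer reaching for documents outside their subsystem is
# denied and the attempt is logged for review:
print(audit("myu", "product_engineer", "specs_other_subsystems"))
```

Had controls like this been in place, a bulk download of 4,000 documents spanning many subsystems would have tripped both the deny rule and the audit trail.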

N.Y. AG in 'witch hunt' for Intel, says think tank

An advocacy group today slammed the New York Attorney General's office for filing a federal antitrust lawsuit against chip giant Intel Corp. New York Attorney General Andrew Cuomo on Wednesday filed the suit, which alleges that Intel threatened computer makers, made payoffs and engaged in a "worldwide, systematic campaign of illegal conduct." The suit's charges resemble those listed in a 2005 federal lawsuit filed against Intel by AMD that's expected to go to trial this coming spring. Intel quickly received some vocal support from the Competitive Enterprise Institute, a Washington, D.C.-based think tank that advocates for free enterprise and limited government. Calling New York's lawsuit a "witch hunt," the group noted in an e-mailed statement that "few markets are as vibrant and innovative as the processor market."

"Mr. Cuomo's suit is just the latest example of the New York Attorney General using his authority to make headlines at consumers' expense. This baseless attack against Intel will only delay innovation in the computer chip market," said Ryan Radia, associate director of Technology Studies at the Competitive Enterprise Institute, in a statement. "During the very period that Mr. Cuomo alleges Intel was engaged in 'anti-competitive' behavior, desktop computer processors more than doubled in performance per dollar every two years. By objective measures, the performance of the processor market has been nothing short of spectacular," he added. The New York suit isn't the only set of antitrust charges leveled at Intel - just the latest. Intel is facing similar charges in Korea, Europe and Japan, as well as class-action cases and the lawsuit filed by AMD. But Radia contends that the New York attorney general is simply off base in this suit. "Mr. Cuomo's suit rests on the fundamentally flawed assumption that Intel's high market share is indicative of market control," he added. "In fact, Intel and archrival AMD have been competing fiercely for over a decade, and both firms continue to invest billions of dollars each year in researching and developing faster, more efficient chips."

Lenovo founder shares slogans, tells tales of 1980s China

The chairman of Lenovo, the world's number-four PC maker, shared Chinese revolution-spirited slogans and the unlikely story of his company's growth out of a government-managed economy in a motivational speech to Chinese small business owners on Friday. Liu Chuanzhi, one of 11 former government researchers who founded the predecessor to Lenovo in 1984, recalled how the company fought through trade barriers, high component prices and domination by foreign brands, in a talk that highlighted how fast China's economy has grown in the last three decades. Lenovo faced tough odds even though its founders came from the Chinese Academy of Sciences, a Chinese state-controlled institute for national research projects. "In 1993 almost the whole market was foreign-branded computers," Liu said at the forum for small and medium businesses in Hangzhou, a scenic city in eastern China.

The academy gave the company founding capital of 200,000 yuan, or about US$30,000 today. That sum was far from enough; it would not have bought even three computers in China in the 1980s, Liu said. Things only grew worse when the company was scammed out of two-thirds of the money, he said. "When we came out, we not only lacked funds but also had no idea what to do," Liu said. Lenovo, formerly called Legend Group, was founded early in China's process of market economic reforms and had to work in a tightly regulated environment. China tried to protect domestic PC makers in the 1980s by charging a massive 200 percent tariff on foreign computers, Liu said. "The result of this protection was that foreign computers were very difficult to get into China and could only be smuggled, but China also could not make its own computers very well," he said. The low quality of components such as hard drives in China also hindered the company, he said.

Lenovo's first PC did not reach the market until 1990. Lenovo struggled with low margins even as it built market share against foreign brands like IBM and Compaq in the 1990s. Government regulation also continued to slow the industry's growth: Chinese residents had to register with the government to become Internet users even late in the decade, Liu said. Lenovo's global presence gained a huge boost when it bought IBM's PC unit in 2005. The company's sales in developed markets have since slumped in the global economic recession, and it has restructured to bring its focus back to China and other emerging markets. Lenovo today is the top PC vendor in China. Liu also emphasized the importance of company culture, describing Lenovo's as an example. "Make the company's interests the top priority, seek truth in forging ahead, take the people as the base," Liu said. To succeed like Lenovo, companies must "love to battle, know how to battle, and conduct campaigns with order," Liu said, using language reminiscent of "The Art of War," the ancient Chinese book on military strategy by Sun Tzu.

Chinese president Hu Jintao has promoted the slogan "take the people as the base" as a part of socialist theory. Liu attended a Chinese military college in the 1960s and worked on a Chinese farm during the Cultural Revolution, a chaotic period when many graduates were sent to the countryside for re-education.

Riverbed looks to speed cloud applications, storage

Riverbed Technology has a plan to help companies accelerate access to applications and storage resources that are located in a cloud computing environment and delivered over the Internet to private data centers, distributed branch offices and mobile end users. The vendor says it will deliver new products and capabilities in 2010, beginning with a software version of its flagship Steelhead WAN optimization controller. When enterprises consolidate and virtualize IT infrastructure and applications in their own data centers and deliver applications over the WAN to remote offices and employees, Riverbed's WAN optimization gear plays a role in speeding applications and data transfers. Dubbed "virtual Steelhead for the cloud," the new software appliance can run on servers located at a public cloud computing facility.

But installing a traditional hardware appliance isn't an option in most public cloud environments. "As customers move from their private cloud environments into either public environments or hybrid public/private environments, we want our technology to move with them. But there's a bit of a problem because you can't get a physical box into that public cloud," says Eric Wolford, senior vice president of marketing and business development at Riverbed. "Virtual Steelhead for the cloud enables us to move our acceleration technology into that cloud environment." What customers will wind up with is "a three-way type of acceleration, with a Steelhead box at the remote site, a Steelhead box in their data center in the private cloud, and now the virtual Steelhead in their public cloud," he says. Riverbed is hosting a cloud launch event Tuesday in New York City, where it plans to show how the virtual Steelhead appliance works, including a demonstration of how to install it on the Amazon Web Services platform. In addition, Riverbed plans to preview new technology for the acceleration of cloud storage.

Enterprises are interested in the potential cost savings and operational benefits of using cloud storage, but concerns about latency, the need to rewrite applications and the possibility of getting locked into a particular cloud provider's platform are hindering adoption. Cloud storage also raises tricky performance issues, Wolford says. There are fundamental inefficiencies in block storage protocols that restrict enterprises from running those protocols over the WAN. "If you try to run a block protocol over the WAN, performance will grind to a halt," he says. Now Riverbed says it has addressed some of those protocol inefficiencies, specifically for iSCSI, and can boost performance enough to allow enterprises to move their storage assets to sites anywhere in the world - even thousands of miles away from associated computing resources, Wolford says.

Riverbed is tackling server-to-disk chattiness by cutting roundtrip block requests, Wolford says. It's similar to the way Riverbed's technology deals with application latency, by slashing the number of roundtrips over the WAN that a chatty protocol such as CIFS requires. "It's very correct to say the high-level pattern is just like we did with applications, making them feel LAN-like over the WAN. We will make storage protocols over the WAN feel SAN-like." Initially Riverbed plans to focus its cloud storage technology on unstructured data, such as files, mail and Microsoft SharePoint storage; classic unstructured data workloads such as these lend themselves well to the technology, Wolford says. If Riverbed can solve the performance issues associated with cloud storage the way it did for applications and data transfers on the WAN, it will open up the entire market, according to Steve Duplessie, senior analyst at the Enterprise Strategy Group. "High latency alone will limit the types of applications that can find a home in the cloud. Forcing new interfaces or rewriting applications to take advantage of the cloud will be another deal breaker for a lot of folks," Duplessie said in a statement. "If I can think of the cloud the way I think about a disk drive today, the possibilities become truly endless."
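Why a chatty protocol "grinds to a halt" over the WAN is simple arithmetic: every request/response exchange costs one round trip, so latency, not bandwidth, dominates. The sketch below is a back-of-the-envelope model with illustrative numbers (the roundtrip counts and link figures are invented for this example, not Riverbed's data):

```python
def transfer_time(roundtrips: int, rtt_ms: float, payload_mb: float,
                  bandwidth_mbps: float) -> float:
    """Seconds to move a payload when a chatty protocol needs
    `roundtrips` request/response exchanges over a link with the
    given round-trip time and bandwidth."""
    serialization = payload_mb * 8 / bandwidth_mbps   # seconds on the wire
    latency = roundtrips * rtt_ms / 1000.0            # seconds spent waiting
    return serialization + latency

# A 100 MB transfer over a 100 Mbit/s WAN with 80 ms RTT:
chatty = transfer_time(roundtrips=20_000, rtt_ms=80, payload_mb=100, bandwidth_mbps=100)
optimized = transfer_time(roundtrips=200, rtt_ms=80, payload_mb=100, bandwidth_mbps=100)
print(f"chatty: {chatty:.0f}s, optimized: {optimized:.0f}s")  # chatty: 1608s, optimized: 24s
```

With the same bandwidth, cutting roundtrips by 100x shrinks the transfer from about 27 minutes to under half a minute, which is the general effect WAN optimization aims for.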

Novell, SAP bring together security, compliance wares

Novell and SAP Tuesday announced a partnership to integrate, certify and support their respective security and identity technology and governance, risk and compliance software. Novell has integrated and certified its Novell Compliance Management Platform extension for SAP, Novell Identity Manager and Novell Sentinel to work with SAP's Business Objects GRC Access Control. In the next 30 days, that certification will extend to the other two applications in SAP's Business Objects governance, risk and compliance (GRC) suite - Process Control and Risk Management. The Novell software provides user provisioning, access control and security event monitoring, while the SAP tools address risk and access management, data monitoring and compliance management and reporting. "We cover the entire stack of GRC from applications to IT controls," says Ranga Bodla, senior director for governance, risk and compliance at SAP.

The two vendors hope the integration lets IT reduce costs and infrastructure by combining IT access controls and business process controls in a single integrated system. In essence, the two are creating a hub for defining security, identity and GRC across a network. "Users can synchronize across not only SAP applications but across all applications," says Jim Ebzery, senior vice president and general manager for identity and security at Novell. "So processes and policies in SAP Access Control can be mapped to another enterprise application with the same access controls tightly linked." The partnership between the two vendors began early last year, when Novell also announced a partnership to optimize SAP on SuSE Linux Enterprise and with Novell's virtualization and identity platforms. The companies also said they would optimize Novell's operating system for SAP's data center infrastructure. Follow John on Twitter: http://twitter.com/johnfontana

Online test helps you self-diagnose H1N1 flu

Feeling sick? Wondering if it's the H1N1 flu or just a regular old go-away-don't-come-near-me flu? Face it, your doctor may not be able to squeeze you right in.

But you may be able to figure it out using a Web-based self-assessment tool developed by researchers at Emory University in Atlanta. The tool is now available on several national Web sites, including flu.gov , the Centers for Disease Control and Prevention (CDC), and Microsoft's H1N1 Response Center . The online test includes questions like: Do you have a fever? Have you been short of breath? Do you have a pain or pressure in your chest that you didn't have before? Were you feeling better, and now a fever or cough is returning? The H1N1 flu , also widely known as the swine flu, is a fairly new influenza virus that has spread around the world.

The CDC reports that it first appeared in the United States this past April. By June 11, the World Health Organization had categorized it as a pandemic . Because it's extremely contagious, hospitals and health care workers have been bracing for H1N1 to hit hard this fall, and with concerns about the new flu running high , health care providers expect to get slammed with a mounting wave of people rushing in to find out if they have the virus. The online test, dubbed the Strategy for Off-Site Rapid Triage, is designed to help people figure out whether they need to see their doctor or go to a hospital. "This Web site is carefully designed to encourage those who are severely ill, and those at increased risk for serious illness, to contact their doctor, while reassuring large numbers of people with a mild illness that it is safe to recover at home," Arthur Kellermann, professor of emergency medicine and an associate dean at the Emory School of Medicine, said in a statement. "Hopefully, providing easy-to-understand information to the public will reduce the number of people who are needlessly exposed to H1N1 influenza in crowded clinic and ER waiting rooms, and allow America's doctors and nurses to focus their attention on those who need us most."
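A triage tool of this kind boils down to a small rule set: danger signs send you to a doctor, mild symptoms get reassurance. The questions below come from the article; the decision rules are a hypothetical sketch, not Emory's actual SORT algorithm:

```python
def triage(fever: bool, short_of_breath: bool, chest_pain: bool,
           fever_returned: bool) -> str:
    """Toy triage rule set. Danger signs (trouble breathing, chest
    pain, a fever that went away and came back) outrank a plain fever.
    Hypothetical thresholds for illustration only."""
    danger_signs = short_of_breath or chest_pain or fever_returned
    if danger_signs:
        return "contact your doctor or go to an emergency department"
    if fever:
        return "likely safe to recover at home; monitor your symptoms"
    return "no flu-like danger signs reported"

print(triage(fever=True, short_of_breath=False, chest_pain=False,
             fever_returned=False))
```

The point of the design, as Kellermann describes it, is exactly this asymmetry: route the few severe cases to care while keeping the many mild ones out of crowded waiting rooms.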

Microsoft opens Outlook format, gives programs access to mail, calendar, contacts

Microsoft Monday said it will provide patent- and license-free use rights to the format behind its Outlook Personal Folders, opening e-mail, calendar, contacts and other information to a host of applications such as antimalware or cloud-based services. The written documentation will explain how to parse the contents of the .pst file, which houses the e-mail, calendar and contact contents of Outlook Personal Folders, detailing how the data is stored along with guidance for accessing that data from other software applications. Documenting and publishing the .pst format could open up entirely new feature sets for programs such as search tools for mining mailboxes for relevant corporate data, new security tools that scan .pst data for malicious software, or e-discovery tools for meeting compliance regulations, according to Microsoft officials.

Microsoft plans to publish the documentation outlining the .pst format in the first half of next year. The effort is designed to give programs the knowledge to read Outlook data stored on user desktops. Once the documentation of the .pst format is public, programmers can get into .pst files and read the contents without the need for Outlook; in fact, there will be no requirement for any Microsoft software. "You could also imagine this being used for data portability, possibly into the cloud," said Paul Lorimer, group manager for Microsoft Office interoperability. "A user might have data on a hard drive that they would like to migrate to a cloud service. This would allow the cloud service developers to write code on the server so someone could upload their .pst and have it read on the server rather than needing Outlook to be running on the client and somehow get the data that way." The information will be released under Microsoft's Open Specification Promise (OSP), which began in 2006. That year, Microsoft dropped intellectual-property and patent claims to 35 Web services protocols it developed mostly for use in its identity infrastructure. In 2008, Microsoft added its Interoperability Principles and promised to support data portability in its most popular "high-volume products," including SQL Server 2008, Office 2007, Exchange 2007 and Office SharePoint Server 2007. The same year, Microsoft added the Office file formats to the OSP, even as critics said the formats were incomplete and the submission was designed to boost Office Open XML (OOXML) in the eyes of standards bodies.

Data in the .pst file is available to developers today via Microsoft's Messaging API (MAPI) and the Outlook Object Model, but Outlook needs to be installed on the desktop. With the documented format, users are free to choose any platform, including Linux, and any development language, such as Java or Ruby on Rails. Microsoft Monday was entertaining a number of customers and partners on its Redmond campus to help gather feedback on the documentation. Critics such as the Software Freedom Law Center have warned that inconsistencies are possible between Microsoft formats available under the OSP and the open source GPL license.
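To see what "reading a .pst without Outlook" looks like in practice, even a few lines of code can identify a file and its variant from the fixed header. This sketch follows the header layout Microsoft subsequently published as the [MS-PST] specification; the field names and offsets come from that spec, not from the article:

```python
import struct

def pst_format(path: str) -> str:
    """Identify a .pst file and its variant from the file header.
    Per the later-published [MS-PST] layout: bytes 0-3 are the magic
    value "!BDN", and the 16-bit wVer field at offset 10 distinguishes
    the ANSI (Outlook 97-2002) and Unicode (Outlook 2003+) formats."""
    with open(path, "rb") as f:
        header = f.read(12)
    if header[:4] != b"!BDN":                    # dwMagic signature
        raise ValueError("not a .pst file")
    wVer = struct.unpack_from("<H", header, 10)[0]
    if wVer in (14, 15):
        return "ansi"
    if wVer >= 23:
        return "unicode"
    raise ValueError(f"unrecognized .pst version {wVer}")
```

Actually walking the B-tree structures inside the file is far more involved; this only shows that, with the format on paper, no Microsoft software is needed even for the first step.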

Microsoft last year added language to the OSP on patent/copyright coverage and information on how the OSP interacts with GPL-based software development.

UK customers get access to Google's PowerMeter

Google's PowerMeter service made its U.K. debut on Wednesday, allowing British Gas customers to monitor their electricity usage online from afar. British Gas has partnered with AlertMe, a company based in Cambridge, England, that sells hardware and a service that sends electricity consumption data over the Internet. AlertMe has linked its system with Google's PowerMeter, which can generate graphs of electricity usage that can be viewed anywhere using a mobile phone or computer. Many utilities are working on smart-meter technology to help consumers become more aware of their electricity usage and take steps to reduce it.

Google, which announced its PowerMeter service in February, estimates that consumers can cut their energy consumption by up to 15 percent if they have more information on their consumption patterns. The U.K. government is striving to have smart meters in every home by 2020, and the U.S. plans to install up to 40 million smart meters as part of a recent economic stimulus plan. Google's PowerMeter application can be placed on a person's iGoogle home page, a page people can customize with various Google gadgets. To monitor electricity, AlertMe sells a device called a "meter reader." Since U.K. power meters can be read inductively, a homeowner just needs to clip the device onto the meter, said Pilgrim Beart, AlertMe's CEO. That avoids the difficulty of installing "smart meters," devices designed to replace old power meters for use in conjunction with remote monitoring systems. The meter reader takes second-by-second measurements, which are transmitted wirelessly to AlertMe's "Nano Hub," which plugs into a router. The information is then sent to AlertMe, which has its own Web-based front end where customers can log in and view their electricity data.

Customers can also opt to view the information through Google's PowerMeter. In addition, AlertMe sells a "SmartPlug" that records the energy consumption of a specific device; those cost £25 (US$41) each. Under the current configuration, however, Google's PowerMeter will only display total household energy consumption, while AlertMe's service can break it down by device.

AlertMe sells the hardware for £69 and charges £2.99 per month for the online service. Users can also opt for a slightly cheaper arrangement, a one-year £99 subscription that includes the meter reader, transmitter and hub. The option to use Google PowerMeter is free. The data can be viewed in kilowatt hours by day, week, month or year, or as the total cost of the electricity. AlertMe is also testing with British Gas a system that would allow homeowners to control their programmers - thermostats, in U.S. parlance - online, Beart said.
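The kilowatt-hour and cost views rest on simple arithmetic over the second-by-second power samples the meter reader collects. A minimal sketch (the 13p/kWh tariff is an assumed figure for illustration, not a British Gas rate):

```python
def usage_cost(watt_samples: list[float], interval_s: float,
               pence_per_kwh: float) -> tuple[float, float]:
    """Convert power samples (watts, one per `interval_s` seconds)
    into kilowatt-hours and cost in pence."""
    joules = sum(watt_samples) * interval_s   # watts x seconds = joules
    kwh = joules / 3_600_000                  # 3.6 million joules per kWh
    return kwh, kwh * pence_per_kwh

# One hour of second-by-second samples at a constant 2 kW (roughly a kettle):
kwh, cost = usage_cost([2000.0] * 3600, interval_s=1.0, pence_per_kwh=13.0)
print(f"{kwh:.1f} kWh, {cost:.0f}p")  # 2.0 kWh, 26p
```

Summing the same per-second stream over a day, week, month or year gives exactly the views the service offers.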

AlertMe is already working on more advanced features that will give consumers more control over power bills. Its labs are working on software that would be able to distinguish major appliances from one another, such as identifying the refrigerator versus the washing machine versus a kettle. All of those appliances have distinct electricity patterns that can be identified using algorithms, Beart said, and the data can then be used to figure out consumption patterns. That improvement would mean people wouldn't need the £25 SmartPlug.
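The appliance-identification idea can be illustrated with a toy version of the approach: watch the aggregate power feed for step changes and match each step to a known appliance wattage. Real disaggregation algorithms, including whatever AlertMe's labs are building, use far richer waveform features; the signatures and thresholds below are invented for illustration:

```python
def label_events(samples: list[float], signatures: dict[str, float],
                 tolerance: float = 100.0) -> list[str]:
    """Naive load disaggregation: flag step changes in aggregate power
    (watts) and match each step's magnitude to the closest known
    appliance wattage. Steps smaller than `tolerance` are treated as noise."""
    events = []
    for prev, cur in zip(samples, samples[1:]):
        delta = cur - prev
        if abs(delta) < tolerance:
            continue                      # small wobble, not an appliance event
        name, _ = min(signatures.items(),
                      key=lambda kv: abs(kv[1] - abs(delta)))
        state = "on" if delta > 0 else "off"
        events.append(f"{name} {state}")
    return events

readings = [300, 300, 2500, 2500, 300, 450]         # aggregate watts over time
sigs = {"kettle": 2200, "fridge compressor": 150}   # invented signatures
print(label_events(readings, sigs))
```

Matching on step magnitude alone confuses appliances with similar wattages, which is why production systems also look at shapes such as startup surges and cycling patterns.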

The delivery of the information is important. AlertMe is entirely consumer-focused and wants to make it easy to "engage people effectively and change their behavior," Beart said. Although AlertMe's Web front end offers more detail than Google's PowerMeter, Beart said the Google option gives people another way to look at the data. Separately on Tuesday, Google announced it had reached its first agreement with a U.K. utility to use its PowerMeter. The company, first:utility, is providing free smart meters to some 30,000 U.K. customers.

Starting next month, those customers will be able to use Google's PowerMeter in conjunction with those meters. For its PowerMeter program, Google said it has secured agreements with two device companies and 10 utilities in five countries, wrote Ka-Ping Yee, a software engineer, and Jens Redmer, of Google's business development team, on a company blog.

Verizon CTO: 'We told you so' about FiOS

Verizon CTO Dick Lynch has a simple message for anyone who doubted his company's wisdom in building out a fiber-to-the-home (FTTH) network: We were right, you were wrong. Speaking at the FTTH Conference and Expo in Houston Tuesday, Lynch crowed about his company bringing FiOS Internet services to an estimated 3.1 million subscribers in the United States. He put particular emphasis on chiding skeptical analysts and rival companies that tried to cast doubt upon Verizon's fiber plans. "In an attempt to maintain the status quo, our competitors did their best to create customer confusion around fiber-optic services," he said. "They claimed that their networks had been fiber for a decade, and they distributed misleading messages about the quality of FiOS. Their communications strategy was to create confusion and apathy, and some people fell for it." Specifically, Lynch singled out a "potential customer" that told The Washington Post a few years back that "there's nothing on the Internet that requires that kind of bandwidth." Now, with the rise of YouTube, Facebook and other bandwidth-intensive Web applications, Lynch said Verizon is having the last laugh. "With the exception of our competitors, everyone secretly hoped we would succeed," he said. "The industry experts would publicly say, 'Verizon is spending too much' or 'consumers don't need fiber.' But then they'd turn around and call us to find out how soon FiOS would be coming to their neighborhood." Verizon's FiOS services offer customers peak download speeds of 50Mbps and peak upload speeds of 20Mbps. Cable companies this year have begun ramping up their tests for faster services to compete with FiOS, as Comcast and Cablevision have started rolling out new Internet services based on the DOCSIS 3.0 standard that will offer businesses potential peak download speeds of 100Mbps.

Verizon has said in the past that it is trialing 100Mbps FiOS technology, although the company has given no timeline for when that technology might hit the market.

Spam, malware dominate online user comments, Websense reports

A staggering 95% of all "user-generated comments" on blogs, chat rooms and message boards are spam or malicious, according to a new Websense report on security threat trends. "That's the first time we started monitoring that," says Patrick Runald, Websense senior manager for security research, about the level of spam and malware ploys carried out around blogs and chat rooms. The Websense Security Labs "State of Internet Security Q1 – Q2 2009," which covers the period up to June of this year, also notes that the number of malicious Web sites more than tripled during the period. In addition, 77% of Web sites with malicious code are said to be legitimate sites that have been compromised. According to the report, based on data collected in part from scanning 40 million Web sites every hour, 61% of the Top 100 sites are said to either be hosting malicious content or to contain a masked redirect to lure unsuspecting victims from legitimate sites to malicious ones. "The bad guys are finding new ways for disseminating malware," Runald said. "It's getting worse."

More than 47% of the Top 100 sites, particularly social-networking sites such as Facebook or YouTube, support user-generated content, which the report notes is becoming a significant way to disseminate malware and conduct fraud. "On Facebook and other social-networking sites, there's an explicit sense of trust," says Runald. "That's why the bad guys are attempting to exploit it, with malware like Koobface, which could hijack your machine and send messages." In the area of cybercrime, one significant attack involved criminals seizing control of the CheckFree Web site and attempting to redirect users to a site hosted in Ukraine that tried to install malware on victims' computers. The report said CheckFree has more than 24 million customers and controls 70%-80% of the online bill-payment market.

Avaya wins Nortel enterprise business for $900 million

Avaya has emerged as the winning bidder for Nortel's enterprise business, reportedly beating out Siemens Enterprise Communications over the weekend. The firm will pay $900 million for the unit, Nortel's Government Solutions group and DiamondWare Ltd., a Nortel-owned maker of softphones, and will also contribute an additional pool of $15 million for an employee retention program. That price is nearly twice what Avaya was initially said to be paying for the enterprise business back in July, before auction bidding kicked in.

Telecom carrier Verizon, however, is expected to contest the sale on the grounds that Avaya does not plan to retain customer support contracts between Nortel and Verizon. Avaya has sought Nortel's enterprise business in hopes of boosting its share of the enterprise telephony and unified communications markets and getting more customers to migrate to its IP line of communications products. The sale, expected to close late in the fourth quarter, is subject to court approvals in the U.S., Canada, France and Israel, as well as regulatory approvals, other customary closing conditions and certain post-closing purchase price adjustments; in some EMEA jurisdictions the transaction is also subject to information and consultation with employee representatives. Nortel will seek Canadian and U.S. court approvals of the proposed sale agreement at a joint hearing on September 15, 2009. Nortel is confident the sale will go through without any snags. "We do not expect the Verizon interaction to impact court approval or the close of this deal," said Joel Hackney, president of Nortel Enterprise Solutions. "We will continue to go forward in supporting customers." Hackney would not say whether Nortel is engaged in the negotiations between Avaya and Verizon over the future of certain customer support contracts, saying only that Nortel supports Verizon as a customer, as well as the carrier's customers. He also said there were two bidders for the enterprise unit but would not identify the second suitor. Nortel customers hope the deal works out in their interest. "Nortel earned the trust of our user group members by delivering innovative, reliable communications solutions and ensuring high levels of service and support," said Victor Bohnert, executive director of the International Nortel Networks Users Association, in a prepared statement. "With the announcement of today's purchase by Avaya, we look forward to extending that relationship forward to serve the business communications needs of our constituency base across the globe."

As previously announced, Nortel does not expect that its common shareholders or the preferred shareholders of Nortel Networks Limited will receive any value from the creditor protection proceedings and expects that the proceedings will result in the cancellation of these equity interests.

Data breach hits payroll firm PayChoice

PayChoice of Moorestown, N.J., suffered an online breach that has apparently compromised its payroll-processing operations. "PayChoice discovered a security breach in its online system on Wednesday, September 23, 2009," PayChoice CEO Robert Digby confirmed in a statement. "We are handling this incident with the highest level of attention as well as concern for our clients, software customers and the employees they serve." According to the Washington Post, a number of PayChoice customers were subjected to an apparent phishing scam when they received an e-mail instructing them to download a Web browser plug-in in order to maintain access to PayChoice's online payroll service, onlineemployer.com. Post writer Brian Krebs reports that the phishing attack was apparently aimed at getting into payroll and account data. PayChoice today indicated it is working to provide additional information about the breach.

PayChoice is said to license its online employee payroll management product to more than 240 other payroll processing firms serving 125,000 organizations. In his statement, Digby said that on Sept. 23, "we immediately shut down the online system and instituted fresh security measures to protect client information before starting it up again." The company has also engaged forensics experts to help determine the scope of the intrusion and has notified federal law enforcement, according to Digby. He said licensees and payroll clients will be apprised on a daily basis, and he added that they should take "appropriate protective measures, including changing passwords and user ID information."

Microsoft's CodePlex Foundation leader soaks in stinging critique

After a stinging critique from a noted expert in establishing consortia, the leader of Microsoft's new CodePlex Foundation says such frank evaluation is welcome because the open source group's structure is a work in progress. The CodePlex Foundation's aim is to get open source and proprietary software companies working together. Sam Ramji, interim president of the foundation, was responding to last week's blog post by Andy Updegrove, who said the group has a poorly crafted governance structure and looks like a sort of "alternative universe" of open source development. Updegrove, a lawyer, noted expert on standards and founder of ConsortiumInfo.org, laid out in his post five things Microsoft must change if it wants CodePlex to succeed: create a board with no fewer than 11 members; allow companies no more than one representative on the Board of Directors or Board of Advisors; organize board seats by category; establish membership classes with rights to nominate and elect directors; and commit to an open membership policy.

Despite the stinging tone of Updegrove's assessment, Ramji says he is thankful for the feedback. "Andy's been incredibly generous with his expertise and recommendations," Ramji says. "It is the kind of input and participation we were hoping to get by doing what is probably non-traditional for Microsoft but not necessarily non-traditional for non-profit foundations, which is to basically launch as a beta." Ramji compares the foundation's formation to the early days of a software development project. "We have said in these first 100 days we are looking at everything as a beta. And basically it is re-writable. Obviously, there are some areas like contributions and licensing agreements we put a lot of time into, but even those can be modified." For instance, Ramji says the decision to go with only five people on the board came from Microsoft's experience that larger groups often have difficulty with decision making. He added, however, "There are some best practices [for running the boards of non-profits] that we are not as familiar with as we would want to be." Stephanie Davies Boesch, the foundation's secretary and treasurer, is the only board member with experience sitting on a non-profit's board. Ramji says Updegrove's suggestion to have academic representation on the board was "outstanding. We did not think of that." And to Updegrove's point on becoming an open membership organization, Ramji says, "our goal is to become a membership organization and Andy has some excellent recommendations for that." He says the fact that Updegrove took the time to respond "in the format that he did is more proof that there is something worth doing here." Microsoft announced the foundation Sept. 10 with a stated goal "to enable the exchange of code and understanding among software companies and open source communities." The company seeded the group with $1 million, and Microsoft employees dominate the interim board of directors and board of advisors.

Ramji says the foundation has spent the past couple of weeks listening to feedback in "Twitter messages, email, and phone calls in order to understand what people hope this can be." Within that feedback, two patterns have emerged, Ramji says. One is a call for a broad independent organization that can bridge cultural and licensing gaps in order to help commercial developers participate in open source. The other focuses on creating a place where open source .Net developers can gain strong backing. "Look at projects related to Mono; you also can look at NUnit, NHibernate. We really feel optimistic that the Foundation could help them gain a higher level of credibility in the open source community. They feel they have been lacking that strong moral support," Ramji says. Miguel de Icaza, the founder of the Mono project and the creator of the Gnome desktop, is a member of the Foundation's interim board of directors. From a high level, Ramji says, the Foundation stands as a sort of enabler that helps independent developers, companies and developers working for those companies navigate the nuances and practices of open source development so they can either contribute source code to projects or open source their own technologies. "One suggestion has been that the Foundation should house all the best practices we have seen software companies and open source communities use," said Ramji. "We want to have a place where everyone interested in how to participate can come and read and if they choose they can use our license agreements or can use the legal structure of the Foundation to grant patent licenses and copyrights for developers and derivative works." Those licensing agreements have a distinct focus, Ramji said, on the rights related to contributed code and on how to contribute the patent rights on that code.

Ramji says the goal is to serve multiple projects, multiple technologies and multiple platforms rather than one specific technology base, which is how most current open source foundations are structured. Once licensing issues are settled, code would be submitted using existing open source licenses. "It's early days and we have received a lot of good ideas from experts in a variety of fields, from law to code to policy - that is what we had hoped for," says Ramji. "Someone wrote it is nice to see Microsoft engaging early on without all the answers and to have the community solve what they would like to see. That is satisfying for me and refreshing to others. This is the right way to proceed."

The other iPhone lie: VPN policy support

It turns out that Apple's iPhone 3.1 OS fix of a serious security issue - falsely reporting to Exchange servers that pre-3G S iPhones and iPod Touches had on-device encryption - wasn't the first such policy falsehood that Apple has quietly fixed in an OS upgrade. Apple fixed a similar lie in its June iPhone OS 3.0 update. Before that update, the iPhone falsely reported its adherence to VPN policies, specifically those that confirm the device is not saving the VPN password (so users are forced to enter it manually). Until the iPhone 3.0 OS update, users could save VPN passwords on their Apple devices, yet the iPhone OS would report to the VPN server that the passwords were not being saved. The iPhones' false reporting of their adherence to Exchange and VPN policies has caused some organizations to revoke or suspend plans for iPhone support, several readers who did not want their names or agencies identified told InfoWorld.

Last week's iPhone OS 3.1 update began correctly reporting the on-device encryption and VPN password-saving status when queried by Exchange and VPN policy servers, which made thousands of iPhones noncompliant with those policies and thus blocked from their networks. (Only the new iPhone 3G S has on-device encryption.) Apple's document on the iPhone OS 3.1 update's security changes neglected to mention this fix, catching users and IT administrators off guard. Worse, it revealed that Apple's iconic devices had been unknowingly violating such policies for more than a year. One reader at a large government agency describes the IT leader there as "being bitten by the change" after taking a risk to support the popular devices. "I guess we will all have to start distrusting Apple," said another reader at a different agency. [ Apple's snafu on the iPhone OS's policy adherence could kill the iPhone's chances of ever being trusted again by IT, argues InfoWorld's Galen Gruman. ]

"My guess is the original decision to emulate hardware encryption was made at a level where there wasn't much awareness of enterprise IT standards. After all, this is a foreign language for Apple," says Ezra Gottheil, an analyst at Technology Business Research. "However, once the company realized the problem, it made a spectacularly dumb choice. Instead, it allowed itself to be seen in the worst possible light. The change was necessary and inevitable, but Apple could have earned some points by coming clean at the earliest opportunity. This is the result of a colossal clash of cultures. Even when it is trying, Apple cannot force itself to think like an enterprise vendor." Apple's advice to users on addressing the Exchange encryption policy issue is to either remove that policy requirement for iPhone users or replace users' devices with the iPhone 3G S. IT organizations can also consider using third-party mobile management tools that enforce security and compliance policies; several now support the iPhone to varying degrees, including those from Good Technology, MobileIron, and Zenprise.

Web server attacks, poor app patching make for lethal mix

A dangerous combination of a massive increase in Web server attacks and poor patching practices is a major cause of concern for experts, according to a report issued today by several security organizations. In a groundbreaking study that matched attack trends with patching cycle data, some conclusions came as a shock, said Rohit Dhamankar, the director of security research at 3Com TippingPoint, which contributed real-world attack information - acquired from its intrusion detection systems - to the report. "The sheer number of attacks against Web servers was surprising," said Dhamankar. "In terms of attack volume, they were almost 60% of all so far this year. Hackers are after a foothold in the corporate network, to conduct client-side attacks against visitors of the site, but also, once they have that foothold, to gain much higher privileges and use those to also steal data." Dhamankar pointed to the recent spread of malware from the New York Times Web site as a perfect example of the alarming increase in server attacks. Over the weekend, hackers duped the newspaper into using a malicious ad, which in turn tricked users into downloading and installing fake antivirus software. "The New York Times is a respected brand, and so it's a perfect avenue to infect lots and lots of users," he noted.

The report - which can be read on the SANS Institute's Web site - correlated the high number of Web server attacks with another trend: poor patching practices for the Web's highest-profile third-party applications. "Applications that are widely installed are not being patched at the same speed as the operating system," said Wolfgang Kandek, the chief technology officer of Qualys, which contributed its patching data to the study. "For Adobe Reader, Adobe Flash, Sun Java, Microsoft Office, and Apple QuickTime, the patch cycles are much, much slower than for the operating system," he added. Some servers, once compromised, are even attacking other servers to pillage back-end information and to host malware fed to unsuspecting users, said Dhamankar. The combination of hacked servers and unpatched client applications is critical. "From our point of view, this is a big deal," said Kandek, speaking for security professionals in general. "There are real-life examples where you can see attackers attacking corporate Web servers, then from there infecting client machines, until eventually a client machine is compromised that has full access to the network. Then [attackers] are stealing that corporation's data." "The lack of patching opens up a huge window of vulnerabilities," Kandek acknowledged. "It shows that patching is crucial." Adding salt to the wound, said Dhamankar and Kandek, is that while users are patching, they're patching the wrong software. "Attackers have realized that patching of these third-party apps is complex," added Dhamankar. "They know that a lot of people are focused on patching operating systems rather than patching applications like Flash or Reader." And thus they dig into the most widely installed applications, looking for flaws.

Because operating systems, particularly Windows, are patched by users and organizations at a relatively rapid - and complete - clip, the number of attacks exploiting OSes has dropped precipitously. "Enterprises are focused on OS patching rather than on application patching," said Dhamankar. "They don't have their resources allocated properly." Putting a stop to the threat trend won't be easy, but it is possible, argued Kandek. "Some enterprises have patching policies in place for third-party applications, and there are industry-standard tools to do this," he said. "The technical solutions are out there. [Third-party] patching could be much better, and I see vendors being pressured to do more to integrate their patching into these tools." "But we've done this before," Kandek continued, referring to the security situation several years ago, when Windows was the main target of attackers. Microsoft beefed up its then-OS, Windows XP, dedicated itself to writing more secure code and pushed customers to update religiously. "That means we can do something about this, too," Kandek concluded.
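The audit Kandek describes boils down to comparing installed application versions against known-patched minimums. The sketch below shows the idea; the application names echo those in the report, but the version numbers are hypothetical placeholders, not real advisory data.

```python
# Minimal third-party patch audit: flag any installed application whose
# version is below the minimum patched version. Version strings and the
# "known-good" table below are illustrative, not real advisory data.

def parse_version(v: str) -> tuple:
    """Turn '9.1.3' into (9, 1, 3) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

MINIMUM_PATCHED = {           # hypothetical patched-version floor
    "Adobe Reader": "9.1.3",
    "Adobe Flash": "10.0.32",
    "Sun Java": "6.0.16",
}

def find_unpatched(inventory: dict) -> list:
    """Return apps installed below their minimum patched version."""
    return [app for app, installed in inventory.items()
            if app in MINIMUM_PATCHED
            and parse_version(installed) < parse_version(MINIMUM_PATCHED[app])]

installed = {"Adobe Reader": "9.1.0",
             "Adobe Flash": "10.0.32",
             "Sun Java": "6.0.12"}
print(find_unpatched(installed))  # ['Adobe Reader', 'Sun Java']
```

Industry tools do the same comparison at scale across thousands of endpoints; the gap the report identifies is that many organizations only run this check for the operating system, not for the applications sitting on top of it.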

Oracle breaks silence on Sun plans in ad

Oracle Corp. ended its silence Thursday on its post-merger plans for Sun Microsystems Inc.'s Unix systems in an advertisement aimed at keeping Sun customers from leaving the Sparc and Solaris platforms. Ever since Oracle announced in April its plans to acquire Sun, its competitors - notably IBM and Hewlett-Packard Co. - have been relentlessly pursuing Sun's core customer base, its Sparc and Solaris users. Oracle's ad to "Sun customers" makes a number of promises that include spending more "than Sun does now" on developing Sparc and Solaris, as well as boosting service and support by having "more than twice as many hardware specialists than Sun does now." Analysts see Oracle's ad as a defensive move that doesn't answer some of the big questions ahead of the $7.4 billion merger with Sun. In fact, there may be a lot of room for skepticism and parsing of Oracle's claims, despite their apparently black-and-white assertions.

Among the top hardware makers, Sun registered the biggest decline in server revenue in the second quarter, offering evidence that this protracted merger may be eroding Sun's value. Oracle wanted the acquisition completed by now, but the European Commission this month said it would delay its antitrust review because of "serious concerns" about its impact on the database market. Europe is allowing until mid-January to sort this out, which keeps the merger in limbo for another quarter. Analysts point out that Oracle's plan to spend more "than Sun does now" may be a little hollow because Sun's spending on developing Sparc and Solaris is probably at a low. "The ad sounds convincing - but perhaps being a word nitpicker, the 'than Sun does now' might not mean much if Sun has drastically cut back due to plummeting sales," Rich Partridge, an analyst at Ideas International Ltd., said in an e-mail. "I think someone at Oracle suddenly realized that Sun was bleeding so badly that what would be left when Oracle finally got control would be worth a small fraction of what they paid, and no one would buy the hardware unit," Rob Enderle, an independent analyst, said in an e-mail. But Enderle said the ad's claims do not preclude Oracle from selling its hardware division, and says the company "will have to support the unit for a short time after taking control; during that short time they can easily outspend Sun's nearly non-existent budgets."

Taken at face value, the ad seems to indicate that Oracle will keep Sun's hardware and microprocessor capability and not spin it off, as some analysts believe possible. Gordon Haff, an analyst at Illuminata Inc., said if it was Oracle's plan to start shopping the Sparc processor around on day one of the merger, "would they have put this ad out? Probably not," he said. "Does it preclude Oracle from changing their mind? No. Companies change their mind all the time."

Indeed, Oracle's major competitive concern was indicated in the ad in a quote by Oracle CEO Larry Ellison: "IBM, we're looking forward to competing with you in the hardware business." An erosion of Sun's customer base also hurts Oracle, because a lot of Sun customers are also Oracle customers, and Oracle doesn't want its existing customers to go to IBM and move away from Oracle's platform, Haff said.

Microsoft Issues Emergency Patches for IE

Microsoft today took the unusual step of releasing out-of-band patches for severe security flaws in all versions of Internet Explorer, along with related holes in the Microsoft Active Template Library included with Visual Studio.

Microsoft generally only releases patches outside of its normal monthly cycle for the most dangerous security flaws. The IE risks involve "components and controls that have been developed using vulnerable versions of the Microsoft Active Template Library," according to Microsoft, and could allow an attacker to run commands or download malware on a vulnerable PC if you simply view a malicious Web page. Such drive-by-download attacks are a favorite among Internet attackers.

According to Microsoft, this MS09-034 patch "is rated Critical for Internet Explorer 5.01 and Internet Explorer 6 Service Pack 1, running on supported editions of Microsoft Windows 2000; Critical for Internet Explorer 6, Internet Explorer 7, and Internet Explorer 8 running on supported editions of Windows XP; Critical for Internet Explorer 7 and Internet Explorer 8 running on supported editions of Windows Vista; Moderate for Internet Explorer 6, Internet Explorer 7, and Internet Explorer 8 running on supported editions of Windows Server 2003; and Moderate for Internet Explorer 7 and Internet Explorer 8 running on supported editions of Windows Server 2008."

Translation: if you use any version of IE on Windows 2000, XP or Vista, get the fix ASAP by running Windows Update. IT folks who maintain Windows Server 2003 and 2008 boxes don't need to rush quite as much, but will still want the fix.

The companion patch fixes holes in the Microsoft Active Template Library, part of Visual Studio, which can be used to create the vulnerable ActiveX controls that trigger the IE flaws fixed in the MS09-034 patch. According to Symantec, the ATL patch won't fix vulnerable controls that have already been created, but it will prevent new vulnerable controls from being built. For more information see the MS09-035 bulletin.

EMC distances rival NetApp

EMC has scored another victory over storage rival NetApp by purchasing Data Domain, a deal that widens the technological gap between the companies in the fast-growing data de-duplication market.

NetApp desperately wanted Data Domain to bolster its largely unsuccessful de-duplication business, as evidenced by its $1.9 billion bid to purchase the company. But EMC's pockets proved too deep, and on Wednesday it signed a definitive agreement to buy Data Domain for $2.1 billion.

"This is a move that strengthens EMC and doesn't put them in any financial or competitive bind," notes Pund-IT analyst Charles King. "From a competitive standpoint, I think EMC won the day here."

De-duplication is expected to play a major role in the storage market because it lets companies reduce the amount of disk space they need in their data centers. With data volumes growing quickly, technologies that make storage more efficient will be of huge importance over the next few years, says Forrester analyst Andrew Reichman.

But Reichman believes EMC paid too much for Data Domain. De-duplication is important because it automates the process of reducing storage requirements, but it isn't the only technology that can make storage more efficient, he says. Thin provisioning, snapshots and clones, and denser drives can all help enterprises use disk space more efficiently, he says.
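The space savings at stake come from a simple idea: split data into chunks, fingerprint each chunk, and store any given fingerprint only once. The toy sketch below illustrates the principle; it uses fixed-size chunking for brevity, whereas production systems like Data Domain's typically use variable-size, content-defined chunking.

```python
import hashlib

# Toy content-based de-duplication: identical chunks are stored once and
# referenced by fingerprint. Fixed-size chunks are used only for brevity.
CHUNK_SIZE = 4

def dedupe_store(data: bytes, store: dict) -> list:
    """Store data chunk-by-chunk; return the 'recipe' of chunk fingerprints
    needed to reassemble it later."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicate chunks cost no new space
        recipe.append(digest)
    return recipe

store = {}
recipe = dedupe_store(b"ABCDABCDABCDXYZ!", store)
print(len(recipe))  # 4 logical chunks referenced...
print(len(store))   # ...but only 2 unique chunks actually stored
```

Backup workloads, where the same files are written again and again, are exactly the case where the `recipe`-to-`store` ratio balloons, which is why de-duplication appliances took off in that market first.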

The $2.1 billion price tag for Data Domain could be "mitigated by significant market growth" in de-duplication, Reichman says. "But I think in a number of years we might look back on this deal and say the winner lost and the loser won," he says. "You could say NetApp is a loser in this, but they didn't spend that huge amount of money. That gives them flexibility."

NetApp actually comes out of the negotiations $57 million richer, courtesy of a merger agreement termination fee Data Domain was obligated to pay. NetApp CEO Dan Warmenhoven said the company could not justify "engaging in an increasingly expensive and dilutive bidding war," and that NetApp remains confident in its "already compelling strategic plan, market opportunities, and competitive strengths."

NetApp was smart to walk away from the bidding war, but may still attempt to acquire another de-dupe vendor, says Deni Connor, principal analyst with Storage Strategies Now.

"I think it was a wise decision for NetApp to step away from it," Connor says. "Re-bidding for Data Domain would have really hurt their cash flow. It'll be interesting to see, though, what both companies do, how EMC integrates the Data Domain products and also what NetApp does in order to get some extra de-duplication capability."

Even if NetApp's decision was the right one, the bidding war forced the company to expose its financial limitations relative to EMC.

"This puts NetApp in a curious position," King says. "I've seen some analysts say that the company with the deeper pockets won and that's true enough. But the bidding war has also exposed the amount of money that it took to make NetApp blink, in essence. From a strategic standpoint, that's not a great place for a vendor to be in. … They've laid their cards on the table and moving forward I think that would put them at a disadvantage."

EMC reportedly has more than $7 billion in cash reserves, compared to $2.7 billion for NetApp. EMC also has a sizable sales lead, with $871 million in external disk storage systems factory revenue in Q1 2009, compared to $373 million for NetApp, according to a June report by IDC. NetApp isn't even EMC's biggest rival in terms of storage revenue, as HP, IBM, Dell and Hitachi all earn more than NetApp.

To one observer, the Data Domain bidding war made little sense for either potential buyer. Data Domain's de-duplication technology is robust, but not the only game in town, notes analyst Arun Taneja of the Taneja Group. Data Domain's technology is single-node, meaning it can only de-dupe one node at a time, whereas rivals such as FalconStor, Sepaton and Permabit offer the more expansive global de-duplication, Taneja says.
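The single-node versus global distinction Taneja draws can be made concrete with a small sketch (an illustration of the concept, not any vendor's implementation): with per-node fingerprint indexes, a chunk written to two different nodes is stored twice; with one shared index, it is stored once no matter which node receives it.

```python
import hashlib

def store_chunks(chunks, index):
    """Add chunks to a fingerprint index; return how many consumed new space."""
    added = 0
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in index:      # only unseen fingerprints cost space
            index[digest] = chunk
            added += 1
    return added

data = [b"alpha", b"beta"]  # same two chunks land on two nodes

# Single-node de-dupe: each node checks only its own index.
node_a, node_b = {}, {}
single_total = store_chunks(data, node_a) + store_chunks(data, node_b)

# Global de-dupe: both nodes consult one shared index.
shared = {}
global_total = store_chunks(data, shared) + store_chunks(data, shared)

print(single_total)  # 4 - duplicates kept once per node
print(global_total)  # 2 - duplicates removed across all nodes
```

The gap widens as nodes are added, which is why vendors with global de-duplication pitch it for large, scaled-out backup environments.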

"I think there are way less expensive ways of getting really good technology," Taneja said earlier this week, when the outcome of the bidding war was not yet clear. "The price is excessive right now. I'm always in favor of a company getting fair value. This is beyond fair, this has gone into a degree of madness. And I don't understand that because there are other technologies that are extremely viable."

NetApp does offer de-duplication today, but Taneja says the offerings haven't caught on with customers. "Unquestionably, NetApp needs a data de-duplication product. They don't have [a successful] one of their own," he says.

NetApp has offered de-dupe with its VTL product, but "it has no visibility, it has no traction," Taneja said.

NetApp last year boasted that it can de-dupe primary storage from third-party vendors such as EMC, Hitachi and HP, as part of its V-Series line of storage virtualization products. NetApp has de-dupe built into its Data Ontap operating system, but clearly wanted to adopt an appliance-based approach by purchasing Data Domain, Connor says.

Connor expects further consolidation in the data de-duplication market. In addition to NetApp, HP and Dell might be interested in picking up one of the various de-dupe vendors, such as FalconStor, Sepaton, CommVault or Quantum, she says.

"I think it will be interesting to watch what else happens in the de-duplication wars. I don't think it's over yet," Connor says.

King believes the pickings are slim now that Data Domain is off the market. Despite the limitation noted by Taneja, King said Data Domain succeeded in building a product with great efficiency and price-performance. "There are some other good companies out there, but I don't believe there are any with the same stature as Data Domain," King says.

One more question is how EMC will integrate Data Domain into its own line of products. The acquisition could potentially be bad for customers, who would have preferred de-duplication offered by an independent vendor, suggests Juergen Urbanski, managing director with industry analyst firm TechAlpha.

"Storage efficiency (notably de-duplication) is the enemy of a business model predicated on pushing more disk capacity out the door year after year, which is why customers we spoke to would have preferred to see such a disruptive technology remain in the hands of an independent vendor," Urbanski writes in a blog. "By acquiring Data Domain, EMC controls the pace of innovation, possibly pushing out the time when Data Domain's technology becomes applicable to ever broader classes of workloads."

EMC has sometimes maintained acquired companies as separate product lines or business units, for example VMware, Reichman notes. It's too early to tell how far EMC will go in integrating Data Domain into its own product line, he says.

"That's the question," he says. "Do they leave it alone? Or will they take the software technology and merge it into their core offerings? Initially they will definitely want to leave it separate. You could argue they will get more benefit if it's more tightly integrated into their own products."

New Mexico Supercomputer Expectations Raise Doubts

A legislative advisory group is questioning whether a government-funded supercomputer can ever meet its goal of generating revenue and creating significant numbers of high-tech jobs in New Mexico.

A report released late last month by the New Mexico Legislative Finance Committee, which advises the state's legislature on fiscal issues, was skeptical about the future of the $36 million project, overseen by the New Mexico Computing Application Center (NMCAC).

The New Mexico Legislature created the center in 2007.

Hopes were high at first for the supercomputer, called Encanto, which includes Silicon Graphics Inc. technology and 14,500 Xeon processors from Intel Corp.

Early on, it was ranked third on the Top500 supercomputer list. It has dropped to 12th on the list, and it may fall further when the new rankings are released next week at the International Supercomputer Conference in Germany.

Now, according to the advisory group's report, "NMCAC's ability to continue as a going concern is in question."

When launched, the center was expected to generate $59 million for the state over six years by leasing cycle time on the machine and by winning government grants. The report projects that the system will generate about $2 million this year.

Reaching the financial goal "is going to be a challenge," acknowledged Tom Bowles, the science adviser to New Mexico Gov. Bill Richardson and chairman of NMCAC. However, he believes the center's primary attraction to potential customers is its working relationship with the Los Alamos and Sandia National Laboratories.

"If it was just a computer, we wouldn't be any different from any other system in the country," he said.

This version of this story originally appeared in Computerworld's print edition.