Trends are complicated things. Consider, for example, the threat often called ransomware. In our annual Internet Security Threat Report, all signs were that it was on a steady growth path that would continue into 2014. However, more up-to-date intelligence (as documented in our May 2014 Intelligence Report) suggests otherwise. It remains to be seen whether the threat is cyclic, so we shall continue to watch with interest.
Solution providers report that the average customer is running 10 times as many cloud apps as its IT director thinks it is, because employees are taking matters into their own hands when it comes to finding tools that enhance their productivity, whether those tools are sanctioned or not.
All sorts of evidence suggest that most businesses, big or small, are weighing whether to invest in mobile applications, either as an internal tool for employees or as an external service for improving customer or constituent engagement.
In June, law enforcement in the U.S. knocked out the command-and-control servers of the Gameover malware that compromised thousands of PCs around the world. Gameover, developed by hackers using source code from the Zeus family, was particularly effective at spreading the Cryptolocker ransomware that infected so many people in the U.S. and UK last fall and held their files for ransom.
Thanks to the advent of cloud and mobility, video is now at the fingertips of the masses, and, according to Brian Kinne, CTO and vice president of digital media at Cleveland-based solution provider MCPc, it's a technology that's becoming more of a must-have than a nice-to-have for organizations around the globe.
When IT professionals talk about software-defined data centers, the emphasis is typically on virtual servers: how they're created, provisioned and maintained. But a truly smart software-defined data center should be optimized as comprehensively as possible, across all resources. And among those, one of the most critical is storage.
The only constant in life is change, as the saying goes. We've seen this vividly in the technology industry in recent years with not one but two major shifts: big data and cloud. The first drives a new set of workloads that place challenging demands on servers; the second, a new model for consuming those servers. Combined, these shifts place a broad set of new requirements on IT infrastructures.
At this year’s Google I/O developer conference, the technology giant shared its vision of a connected world where smart watches, smartphones, cars, laptops, televisions, and thermostats all interact seamlessly with one another.
Big data projects are bringing together sensitive data to enable a whole new class of business intelligence solutions. But putting all of that data together is not without risks. Giant retailers, pharmaceutical firms and other large enterprises deploying Apache Hadoop to apply analytics to ever-expanding volumes of data are finding that their early projects require data security and authentication controls to keep sensitive information from being exposed.
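To give a concrete sense of the authentication controls involved: out of the box, Hadoop runs in "simple" mode with no real authentication, and hardening a cluster typically starts with switching it to Kerberos in core-site.xml. The sketch below uses standard Hadoop property names, but treat it as a minimal illustration rather than a complete secure-mode setup, which also involves keytabs, principals and per-service configuration.

```xml
<!-- core-site.xml: a minimal sketch of enabling Kerberos authentication
     in Hadoop. Property names are standard Hadoop configuration keys;
     a full secure-mode deployment requires additional settings. -->
<configuration>
  <property>
    <name>hadoop.security.authentication</name>
    <!-- default is "simple", i.e. no authentication -->
    <value>kerberos</value>
  </property>
  <property>
    <name>hadoop.security.authorization</name>
    <!-- enforce service-level authorization checks -->
    <value>true</value>
  </property>
</configuration>
```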