WebProNews

Tag: Cloud Computing

  • Microsoft Challenges Amazon Web Services With Cycle Computing Acquisition

    Among the top players in the tech industry, Amazon Web Services (AWS) is the undisputed king of cloud computing, leading the pack by a wide margin in the relatively new but lucrative segment. However, rivals do not plan to let AWS reign unchallenged. In fact, analysts believe that Amazon will have to brace for a serious contender soon, thanks to Microsoft’s recent acquisition of Cycle Computing, a move aimed at revamping the services offered by its own cloud computing platform, Microsoft Azure.

    Microsoft announced last Tuesday that it acquired Cycle Computing, a high-performance computing and cloud orchestration company known for software that lets companies run massive apps in the cloud. While the acquisition cost was not disclosed, industry watchers agree that it was a brilliant move on Microsoft’s part, as it will complement Azure, its cloud computing business.

    While Cycle Computing may not have brand recognition outside tech circles, it was instrumental in shaping cloud computing as we know it today, according to Business Insider. In 2012, Cycle transformed Amazon’s emerging Web Services into a supercomputing powerhouse with computing power said to be equivalent to 50,000 individual PCs.

    In fact, Cycle boasts of an elite clientele to this day, which includes top cloud vendors AWS, Microsoft Azure and Google. Other big clients using Cycle Computing’s suite of cloud orchestration products include NASA, Pacific Life, MetLife, Novartis, as well as various biotech, media, and manufacturing corporations.

    Just how far ahead is AWS in the cloud computing game? At the moment, reports indicate that Amazon’s cloud computing division enjoys the lion’s share of the public cloud computing pie with 41 percent of the market. Microsoft Azure is a distant second at 13 percent, followed by Google with 7 percent and IBM with 5 percent market share.

    However, Microsoft’s acquisition of Cycle could shake things up. Azure has enjoyed explosive revenue growth since last year. Revenue rose by 97 percent for the quarter ending June 2017, with the prior three quarters registering stunning growth rates of 93 percent, 93 percent, and 116 percent, respectively. Meanwhile, AWS managed to grow its revenue by 42 percent for the first half of this year, which is impressive but still relatively tame compared to Azure’s recent performance.

    With Cycle to support its efforts, Azure could close the gap with AWS even faster as it begins to offer more competitive products. Of course, the safe bet is that Amazon is not going to take this challenge sitting down. Expect the battle for the cloud to heat up soon.

    [Featured Image by Microsoft]

  • Twas the Night Before Christmas and All Through the Cloud…

    The Google Cloud Team posted a fun poem for all of us techno nerds. “2016 is winding down, and we wanted to take this chance to thank you, our loyal readers, and wish you happy holidays,” wrote Alex Barrett, Editor of the Google Cloud Platform Blog. “As a little gift to you, here’s a poem, courtesy of Mary Koes, a product manager on the Stackdriver team channeling the Clement Clarke Moore classic.”

    Twas the night before Christmas and all through the Cloud
    Not a creature was deploying; it wasn’t allowed.
    The servers were all hosted in GCP or AWS
    And Stackdriver was monitoring them so no one was stressed.

    The engineers were nestled all snug in their beds
    While visions of dashboards danced in their heads.
    When then from my nightstand, there arose such a clatter,
    I silenced my phone and checked what was the matter.

    Elevated error rates and latency through the roof?
    At this rate our error budget soon would go poof!
    The Director OOO, the CTO on vacation,
    Who would I find still manning their workstation?

    Dutifully, I opened the incident channel on Slack
    And couldn’t believe when someone answered back.
    SClaus was the user name of this tireless engineer.
    I wasn’t aware that this guy even worked here.

    He wrote, “Wait while I check your Stackdriver yule Logs . . .
    Yep, it seems the errors are all coming from your blogs.”
    Then in Error Reporting, he found the root cause
    “Quota is updated. All fixed. :-)” typed SClaus.

    Who this merry DevOps elf was, I never shall know.
    For before we did our postmortem, away did he go.
    Just before vanishing, he took time to write,
    “Merry monitoring to all and to all a silent night!”
    Happy holidays everyone, and see you in 2017!

  • Microsoft Ends Moore’s Law, Builds a Supercomputer in the Cloud

    A group of Microsoft engineers has built an artificial intelligence system based on deep neural networks that will be deployed on Catapult by the end of 2016 to power Bing search results. They say that this AI supercomputer in the cloud will increase the speed and efficiency of Microsoft’s data centers and that there will be a noticeable difference for Bing search engine users. They call it “the slow but eventual end of Moore’s Law.”

    “Utilizing the FPGA chips, Microsoft engineering (Sitaram Lanka and Derek Chiou) teams can write their algorithms directly onto the hardware they are using, instead of using potentially less efficient software as the middle man,” notes Microsoft blogger Allison Linn. “What’s more, an FPGA can be reprogrammed at a moment’s notice to respond to new advances in artificial intelligence or meet another type of unexpected need in a datacenter.”

    The team created a system built around a reprogrammable computer chip called a field programmable gate array (FPGA), which significantly improves the speed of Bing and Azure queries. “This was a moonshot project that succeeded,” said Lanka.

    What they did was insert an FPGA directly between the network and the servers, which speeds up computation by bypassing the traditional software path. “What we’ve done now is we’ve made the FPGA the front door,” said Derek Chiou, one of the Microsoft engineers who created the system. “I think a lot of people don’t know what FPGAs are capable of.”

    Here is how the team described the technology:

    [Image: The Catapult Gen2 card showing FPGA and network ports enabling the Configurable Cloud]

    Hyperscale datacenter providers have struggled to balance the growing need for specialized hardware (efficiency) with the economic benefits of homogeneity (manageability). In this paper we propose a new cloud architecture that uses reconfigurable logic to accelerate both network plane functions and applications. This Configurable Cloud architecture places a layer of reconfigurable logic (FPGAs) between the network switches and the servers, enabling network flows to be programmably transformed at line rate, enabling acceleration of local applications running on the server, and enabling the FPGAs to communicate directly, at datacenter scale, to harvest remote FPGAs unused by their local servers.

    [Image: Hardware and software compute planes in the Configurable Cloud]

    We deployed this design over a production server bed, and show how it can be used for both service acceleration (Web search ranking) and network acceleration (encryption of data in transit at high-speeds). This architecture is much more scalable than prior work which used secondary rack-scale networks for inter-FPGA communication. By coupling to the network plane, direct FPGA-to-FPGA messages can be achieved at comparable latency to previous work, without the secondary network. Additionally, the scale of direct inter-FPGA messaging is much larger. The average round-trip latencies observed in our measurements among 24, 1000, and 250,000 machines are under 3, 9, and 20 microseconds, respectively. The Configurable Cloud architecture has been deployed at hyperscale in Microsoft’s production datacenters worldwide.
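    To make the “front door” idea above a little more concrete, here is a toy model of the Configurable Cloud request path, written as a minimal sketch: an FPGA layer that can handle a request in reconfigurable logic, hand it to a remote FPGA harvested over the network plane, or fall back to the host server’s software. All class and method names are invented for illustration and do not correspond to any real Catapult or Azure API.

```python
# Toy model of the "FPGA as the front door" architecture described above.
# Purely illustrative; names and behavior are invented for this sketch.

class ToyFpga:
    """Sits between the network and its host server (bump-in-the-wire)."""

    def __init__(self, name, accelerated_ops):
        self.name = name
        self.accelerated_ops = accelerated_ops  # ops handled in "hardware"
        self.peers = []                         # other FPGAs reachable over the network plane

    def handle(self, request):
        op, payload = request
        if op in self.accelerated_ops:
            # Handled directly in reconfigurable logic, no host CPU involved.
            return f"{self.name}: accelerated '{op}' at line rate"
        for peer in self.peers:
            if op in peer.accelerated_ops:
                # Harvest a remote FPGA whose local server doesn't need it.
                return f"{self.name} -> {peer.name}: remote-accelerated '{op}'"
        # Fall back to the software stack on the local server.
        return f"{self.name}: passed '{op}' to host server software"

ranker = ToyFpga("fpga-0", {"search_ranking"})
crypto = ToyFpga("fpga-1", {"encrypt_in_transit"})
ranker.peers.append(crypto)

print(ranker.handle(("search_ranking", "query: cloud computing")))
print(ranker.handle(("encrypt_in_transit", b"payload")))
print(ranker.handle(("image_resize", b"...")))
```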

  • The Conversational Computing Revolution is Upon Us

    “We’ve long dreamed of talking computers,” noted Barry Briggs, consultant and former CTO for Microsoft, where he helped lead the company’s transition to the cloud and is generally known as a pioneer in the computing industry. What Briggs is referring to is the advent of talking devices and conversational interfaces, which are just now beginning to reshape how we use computers and, more importantly, how we interact with data.

    Formerly, according to Briggs, talking computers (such as ELIZA) were more or less a trick. “After a time, because of the program’s simplicity, the novelty wears off,” he said.

    However, things are advancing so fast that conversational, Star Trek-style interaction with computers and devices foreshadows a transformative societal shift. Briggs said in January 2014, “The limitations are really gone. We have built software for decades now thinking about what are the limitations that the hardware or the amount of storage for the network place upon us. Those limitations don’t exist anymore.”

    Fast forward to today, and Briggs writes:

    “Because of the nearly limitless computing and storage capacity in the cloud, and because of great advances in AI, machine learning, speech recognition, and data storage and analytics, Weizenbaum’s primitive ELIZA program has evolved into something far more magical and useful,” says Briggs. “Perhaps, even, we’re at the advent of the next big shift in computing, fueled by artificial intelligence and built around a behavior that is most natural to humans: conversations.”

    Briggs sees bots at the advent of this conversational shift. “Want a pizza? Just ask Domino’s chatbot. Or PizzaHut’s chatbot. Need to get somewhere? Ask Uber.”  

    He wonders, “Can bots become the new UI?”

    “For business, the transformation of conversational computing is just beginning. As bots are connected to corporate databases, for example, they’ll simplify tasks from onsite repairs to scheduling meetings into simple conversational actions like, “What parts do I need to fix this?” or “What time is Customer X available next Monday?”

    Eventually, by taking advantage of the massive data storage and mining capabilities available in the cloud, bots will get to know you, providing intelligent suggestions like, “While you’re on site with the customer, I’d suggest examining the engine gearbox, I’m seeing some early failures in other installs,” and learning from previous experiences: “Did the fix I suggested last time help?”

    We’ve come a long way from Weizenbaum’s ELIZA. What started as a bit of sleight-of-hand programming has turned into an entirely new, intuitive and efficient way of interacting with computers. Conversational bots built on cloud-based artificial intelligence enable new frontiers in customer intimacy, simplify access to information, and help businesses and consumers make more informed decisions – quicker.”
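    As a rough illustration of the bot-to-corporate-data flow Briggs describes, here is a minimal, purely hypothetical sketch: a conversational front end that maps a question to a lookup against company data. The “parts database,” matching rules, and replies are all invented for this example.

```python
# Minimal sketch of a bot answering the kinds of questions quoted above.
# Entirely hypothetical; a real system would use proper NLU and live data sources.

PARTS_DB = {
    "gearbox": ["gasket kit", "bearing set", "seal ring"],
    "pump": ["impeller", "o-ring"],
}

def bot_reply(utterance: str) -> str:
    text = utterance.lower()
    if "what parts" in text:
        for asset, parts in PARTS_DB.items():
            if asset in text:
                return f"To fix the {asset} you'll need: {', '.join(parts)}."
        return "Which piece of equipment are you repairing?"
    if "available" in text:
        # A real bot would check a calendar API here; stubbed for the sketch.
        return "Customer X has an opening Monday at 10:00."
    return "Sorry, I didn't catch that."

print(bot_reply("What parts do I need to fix this gearbox?"))
print(bot_reply("What time is Customer X available next Monday?"))
```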

    Read the full article at the Microsoft Transform Blog…

  • Microsoft Partners with Red Hat On Enterprise Linux for Azure

    Microsoft announced on Wednesday that it has entered a partnership with Red Hat to include Red Hat solutions on Microsoft Azure. Red Hat Enterprise Linux is to be offered as the preferred choice for enterprise Linux workloads.

    The two companies are also working together to address enterprise, ISV and developer needs for building, deploying, and managing applications on Red Hat software across private and public clouds.

    Azure will soon become a Red Hat Certified Cloud and Service Provider so customers can run their Red Hat Enterprise Linux applications and workloads on it. Red Hat Cloud Access subscribers will be able to bring their own virtual machine images to run in Azure. Azure customers will be able to utilize Red Hat’s application platform, including Red Hat JBoss Enterprise Application Platform, Red Hat JBoss Web Server, Red Hat Gluster Storage, and OpenShift.

    Customers will also get the benefit of cross-platform support with both companies offering support in an integrated way. According to Microsoft, this is unlike any previous partnership in the public cloud. Support teams will actually reside on the same premises.

    “Red Hat CloudForms will interoperate with Microsoft Azure and Microsoft System Center Virtual Machine Manager, offering Red Hat CloudForms customers the ability to manage Red Hat Enterprise Linux on both Hyper-V and Microsoft Azure,” Microsoft says. “Support for managing Azure workloads from Red Hat CloudForms is expected to be added in the next few months, extending the existing System Center capabilities for managing Red Hat Enterprise Linux.”

    “Expanding on the preview of .NET on Linux announced by Microsoft in April, developers will have access to .NET technologies across Red Hat offerings, including Red Hat OpenShift and Red Hat Enterprise Linux, jointly backed by Microsoft and Red Hat,” it adds. “Red Hat Enterprise Linux will be the primary development and reference operating system for .NET Core on Linux.”

    Red Hat discusses the partnership more in a blog post, as does Microsoft. They’ll be hosting a webcast later on Wednesday to discuss it further.

    Image via Red Hat

  • Cisco Announces Metacloud Acquisition Plans

    Cisco just announced that it intends to acquire private cloud company Metacloud to accelerate its own “intercloud” strategy.

    Metacloud says the deal will not affect users’ environments, and that it already has account reps reaching out to customers to answer questions. It counts Tableau, Ooyala, SK Planet, and Tapjoy among its customers.

    “Metacloud’s OpenStack-based cloud platform will accelerate Cisco’s strategy to build the world’s largest global Intercloud, a network of clouds, together with key partners to address customer requirements for a globally distributed, highly secure cloud platform capable of meeting the robust demands of the Internet of Everything,” Cisco said in its announcement. “Since announcing its Intercloud strategy in March, Cisco has made rapid progress, enlisting key technology partners, service and cloud providers, all of whom are standardizing upon the Cisco Cloud Services architecture, which is based on OpenStack open source software for building private and public clouds.”

    “Cloud computing has dramatically changed the IT landscape. To enable greater business agility and lower costs, organizations are shifting from an on-premise IT structure to hybrid IT – a mix of private cloud, public cloud, and on-premise applications,” said Hilton Romanski, senior vice president, Cisco Corporate Development. “The resulting silos present a challenge to IT administrators, as choice, visibility, data sovereignty and protection in this world of many clouds requires an open platform. We believe Metacloud’s technology will play a critical role in enabling our customers to experience a seamless journey to a new world of many clouds, providing choice, flexibility, and data governance.”

    The company didn’t disclose how much it’s paying for Metacloud, but says it expects the deal to close in the first quarter of fiscal year 2015.

    Metacloud employees will join Cisco’s Cloud Infrastructure and Managed Services organization after the deal is completed.

    Image via Metacloud

  • Cassette Tape That Can Hold 185TB Revealed by Sony

    The cassette tape has long been irrelevant in the consumer market, with the technology hanging on only in a few isolated hipster music circles. In the storage media industry, however, cassettes have never really gone out of style. Magnetic tape is still one of the most reliable ways to archive large amounts of data, and the technology is still improving.

    Sony has announced that its new magnetic tape technology is designed to store more information on a single cassette than ever before. According to the company, the new technology has a “nano-grained magnetic layer with fine magnetic particles and uniform crystalline orientation.” What this means is a new cassette that can store more than 185TB of data on a single cartridge with a recording density of 148GB per square inch. This is, according to Sony, 74 times the amount of data that can be stored on the current highest-density magnetic tape storage cassettes.
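    A quick back-of-envelope check of the figures above, assuming decimal units (1TB = 1,000GB); the numbers come straight from the article:

```python
# Back-of-envelope check of Sony's quoted density and capacity figures.

density_gb_per_sq_inch = 148
cartridge_tb = 185

tape_area_sq_inch = cartridge_tb * 1000 / density_gb_per_sq_inch
print(f"Tape area needed: ~{tape_area_sq_inch:,.0f} square inches")   # ~1,250

# "74 times" the current highest-density cassettes implies roughly:
previous_gen_tb = cartridge_tb / 74
print(f"Implied capacity of current cartridges: ~{previous_gen_tb:.1f} TB")  # ~2.5 TB
```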

    Sony officially announced its new magnetic tape tech earlier this week at the INTERMAG conference in Dresden, Germany. The announcement was made in conjunction with IBM, which measured the recording density of the new media for Sony.

    Sony is calling its new magnetic tape the “next generation” of tape storage media. The product was created by placing uniform layers of crystals on polymer film thinner than 5 micrometers. This was accomplished using a technique called “sputter deposition” and optimizing the technique to provide smooth layers of crystals that are uniform in size. The average thickness of each of these layers is 7.7 nanometers.

    Sony is betting that its new storage technology will be sorely needed in the growing age of cloud computing. Though end users can interact with cloud data in the magical-sounding way the marketing hype suggests, the companies behind these cloud products actually have to store many terabytes of data. With data storage facilities quickly growing, any chance to save space through greater storage density is likely to be popular with large data storage businesses.

    Image via Wikimedia Commons

  • Cloud Computing to See Revenues Surge

    The cloud may be the buzzword that businesses use to signal a forward-thinking approach to investors, but the concept is quickly becoming one of the most important in enterprise. Companies large and small will soon invest billions in cloud computing to establish a flexible platform for their operations.

    Market research firm IHS today issued a new report predicting that enterprise spending on cloud-related technologies in 2017 is set to be triple the amount spent on such technology in 2011.

    IHS estimates that spending on cloud computing technologies will reach $174.2 billion this year, a 20% increase over the $145.2 billion spent on the segment in 2013. The firm also estimates that cloud-related revenue will hit over $235 billion by the year 2017, up 35% from the firm’s 2014 spending estimates.
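    Those growth rates line up with the dollar figures quoted. A quick sanity check (all values in billions of dollars):

```python
# Sanity check of the IHS estimates quoted above.

spend_2013 = 145.2
spend_2014 = 174.2
spend_2017 = 235.0  # "over $235 billion"

print(f"2013 -> 2014 growth: {(spend_2014 / spend_2013 - 1) * 100:.0f}%")  # ~20%
print(f"2014 -> 2017 growth: {(spend_2017 / spend_2014 - 1) * 100:.0f}%")  # ~35%
```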

    Cloud-related spending, according to IHS, will include cloud-based services on which businesses can build both applications and infrastructure. The spending also includes the physical infrastructure related to cloud computing, including the huge number of servers that will be needed to support the segment’s growth.

    “With the cloud touching nearly every consumer and enterprise around the globe, spending for cloud-related storage, servers, applications and content will be dedicated toward building a framework that is rapidly scalable, highly dynamic, available on-demand and requiring minimal management,” said Jagdish Rebello, principal analyst for the cloud and big data at IHS. “The robust growth will come as an increasing number of large and small enterprises move more of their applications to the cloud, while also looking at data analytics to drive new insights into consumer behavior.”

  • Can Google Lure Businesses With Its Powerful Infrastructure And Lower Prices?

    Can billions of search results in milliseconds, 6 billion hours of YouTube video per month & storage for 425 million Gmail users be wrong?

    As if businesses weren’t relying on Google enough, the company took a major step toward gaining even more dependence from businesses this week with the launch of general availability of Google Compute Engine. It’s been called part of Google’s “quest to dominate the world” and its entry into a “heavyweight competition in cloud computing.” One thing’s for sure: Google is courting businesses like never before.

    Are you interested in running your operations on Google Compute Engine? Let us know in the comments.

    Compute Engine is part of Google Cloud Platform. It enables businesses to run large-scale workloads on virtual machines utilizing Google’s own infrastructure. And that’s a powerful infrastructure.

    “You now have virtual machines that have the performance, reliability, security and scale of Google’s own infrastructure,” as Greg DeMichillie, director of product management, puts it. It includes thousands of miles of fiber optic cable. Data is automatically mirrored across storage devices in multiple locations.

    That’s the same infrastructure that lets Google return billions of search results in milliseconds, serve 6 billion hours of YouTube video per month and provide storage for 425 million Gmail users.

    Google is now competing directly with Amazon Web Services and Microsoft Windows Azure, among others.

    Google introduced Compute Engine at I/O last year, showing off an app being used to research the genome and help find potential cancer cures. Our own Zach Walton recapped the presentation:

    Under current computational standards, Google pointed out that the research on the Genome Explorer app would take about 10 minutes to find each match. To show off Compute Engine, they showed the same app powered by 10,000 processor cores provided by Google. This allows a match to be made every second.

    Compute Engine shows off the potential of cloud computing for research. A match a second wasn’t good enough for Google though and they showed a ticker that revealed there were now over 700,000 cores in Compute Engine. From there, they allotted 600,000 cores to the same genome app. Using that many cores, the app was able to discover multiple matches on a constant basis.
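    As a rough reading of those demo numbers, and assuming matches scale more or less linearly with core count (a simplification), the back-of-envelope math looks like this:

```python
# Back-of-envelope reading of the I/O demo numbers, assuming linear scaling.

baseline_seconds_per_match = 10 * 60   # "about 10 minutes" per match
with_10k_cores = 1                     # one match per second on 10,000 cores

speedup = baseline_seconds_per_match / with_10k_cores
print(f"Speedup over the baseline: ~{speedup:.0f}x")                        # ~600x

matches_per_sec_per_core = 1 / 10_000
print(f"On 600,000 cores: ~{600_000 * matches_per_sec_per_core:.0f} matches/sec")  # ~60
```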

    Imagine how that kind of power can help a business scale. It’s been working well for companies like Snapchat, Cooladata, Mendelics, Evite and Wix.

    Thanks to the efforts of Google and its rivals, businesses of all sizes can get access to this kind of computing power relatively inexpensively. And with Google’s announcement this week, its prices just got cheaper. Google has lowered prices for standard instances by 10% in all regions.

    All machine types are charged for a minimum of 10 minutes, and then in 1-minute increments (rounded up to the nearest minute). Here’s a look at the full pricing chart:

    [Image: Compute Engine pricing]
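    The 10-minute-minimum, per-minute billing rule is easy to model. Here is a minimal sketch; the hourly rate used is made up purely for illustration, so refer to Google’s pricing chart for real numbers:

```python
import math

# Sketch of the billing rule described above: a 10-minute minimum, then
# per-minute increments rounded up. The hourly rate is hypothetical.

def billed_cost(runtime_minutes: float, hourly_rate_usd: float) -> float:
    billed_minutes = max(10, math.ceil(runtime_minutes))
    return billed_minutes * (hourly_rate_usd / 60)

print(billed_cost(3.2,  hourly_rate_usd=0.104))   # billed as 10 minutes
print(billed_cost(61.5, hourly_rate_usd=0.104))   # billed as 62 minutes
```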

    The general availability also comes with a 99.95% monthly SLA, 24/7 support, and support for all out-of-the-box Linux distributions (including SELinux and CoreOS) with any kernel or software (including Docker, FOG, xfs and aufs). Google also added support for SUSE and Red Hat Enterprise Linux (in Limited Preview) and FreeBSD.

    They’ve also added three new 16-core instance types in limited preview, transparent maintenance with live migration and automatic restart and faster, cheaper persistent disks.

    “At Google, we have found that regular maintenance of hardware and software infrastructure is critical to operating with a high level of reliability, security and performance,” says VP, Cloud Platform, Ari Balogh. “We’re introducing transparent maintenance that combines software and data center innovations with live migration technology to perform proactive maintenance while your virtual machines keep running. You now get all the benefits of regular updates and proactive maintenance without the downtime and reboots typically required. Furthermore, in the event of a failure, we automatically restart your VMs and get them back online in minutes. We’ve already rolled out this feature to our US zones, with others to follow in the coming months.”

    “Building highly scalable and reliable applications starts with using the right storage,” he says. “Our Persistent Disk service offers you strong, consistent performance along with much higher durability than local disks. Today we’re lowering the price of Persistent Disk by 60% per Gigabyte and dropping I/O charges so that you get a predictable, low price for your block storage device. I/O available to a volume scales linearly with size, and the largest Persistent Disk volumes have up to 700% higher peak I/O capability.”

    As if the competition for hosting your business’s data wasn’t hot enough already, Google, with all of its data center might (which spans the Americas, Asia and Europe), is now making its presence known, and will no doubt take advantage of the fact that businesses are already relying on Google for numerous other components of their operations.

    Google has a couple of case studies from customers using Compute Engine here.

    What do you think? Sold on Compute Engine? Prefer another provider? Let us know in the comments.

    Image: Google

  • IBM Soon To Offer Watson As Cloud Development Platform

    It has been quite a while since we have heard the name “Watson” in association with IBM. I am sure that for most of you, the last time that you heard of IBM’s Watson was his debut on Jeopardy!; however, Watson has been very busy lately. “He” has been working with Citigroup to analyze company data, actively learning the Urban Dictionary and now providing its cloud development platform to third-party businesses.

    According to ComputerWorld, the computing giant IBM will soon be offering access to Watson’s super-computing abilities to businesses in the hope that they will be creating more artificial-intelligence-based applications for their products and services. Watson’s “cognitive learning” talents will be utilized to help build these applications.

    Rob High, IBM’s CTO of Watson, stated the following in an interview with ComputerWorld:

    “We’ve been developing, evolving and maturing the technology. It’s stable and mature enough to support an ecosystem now. We’ve become convinced there’s something very special here and we shouldn’t be holding it back.”

    With its “cognitive learning” capabilities, Watson has been heavily involved in the health care sector since his debut on Jeopardy! High elaborates: “Cognitive systems are different in that they have the ability to simulate human behavior. For the most part humans have had to adapt to the computer. As we get into cognitive systems we open up the aperture to the computer adapting to the human.”

    With Watson’s cloud-computing options offered to businesses, IBM is offering the following tools:

    • Development Toolkit
    • Access to Watson’s API (Application Programming Interface)
    • Educational Materials/Documentation
    • Application Marketplace

    ComputerWorld states that some of the details for the Watson cloud service have yet to be finalized.

    [Image: YouTube]

  • Mozilla, OTOY Introduce ORBX.js, Brings Cloud-Based Graphics Processing To The Web

    In 2010, OnLive introduced a revolutionary idea – leverage the power of the cloud to stream PC games to any PC regardless of its specs. Now a number of Web companies are taking that idea even further with a new Web technology that does the same thing in any browser.

    Mozilla announced today that it has partnered with OTOY and Autodesk to bring a new HTML5 tool called ORBX.js to the Web. The Web technology allows HTML5 applications to leverage AWS to deliver graphics-intensive applications to any modern Web browser. Think of it like OnLive for the Web, but it can be used for more than just games.

    At the moment, ORBX.js is only available to AWS customers. That means those who use AWS can integrate the new Web technology into their app to deliver high-quality graphics-intensive applications to any compatible browser. One such application is Octane Cloud Workstation – Autodesk Edition – a Web app that will allow designers to create sophisticated 3D models without having to own their own powerful workstation. All the work is done on AWS’ servers and is then delivered seamlessly to the user’s PC or other compatible device.

    “Designers and engineers have an increasing need to be mobile, accessing the tools they need anytime, anywhere and from any device. Simple viewing and mark-up of documents is no longer sufficient – they need to be able to access powerful 3D design applications, be able to do real design work and not worry about sacrificing performance,” said Jeff Kowalski, Chief Technology Officer at Autodesk. “This is now possible with technology developed by Autodesk, AWS and OTOY.”

    Here’s a video demo of Octane Cloud running in Firefox:

    Despite the main emphasis of this technology being on graphics-intensive applications, like 3D modeling software, it’s said that ORBX.js can also be used to deliver state-of-the-art gaming experiences to people over the cloud.

    On a final note, OTOY is also offering a subscription service that gives users access to a top-of-the-line workstation PC over the Cloud for $9.99 a month. The demo you saw above was using the same cloud workstation PC to run Adobe Creative Cloud and Unreal Engine 4. You can find out more over at OTOY’s Web site.

    [Image: firefoxchannel/YouTube]

  • This Is How The Cloud Will Enhance Games On The Xbox One

    Cloud computing isn’t the mystical magic power that some make it out to be. It’s not going to make your games suddenly look better, or in the case of Sim City, play better. It will, however, be beneficial to the gaming experience in some unique ways.

    Ever since the Xbox One was announced, Microsoft has touted how cloud computing would enhance the gaming experience on its next-gen console. Now it’s finally explaining what that means in its latest interview with Dan Greenawalt, creative director at Forza Motorsport 5 developer Turn 10 Studios.

    Greenawalt says that cloud computing brings big data to gaming. So, what does that mean exactly? In Forza, he says that a player’s drivatar will be based upon their performance, and other players will race against AI opponents that will drive like you.

    As for other games, Greenawalt says that the same kind of big data can be used in pretty much any genre. The game can learn how others play and provide even better AI based upon those real players. Just imagine a single-player shooter where the AI enemies use formations and tactics used by human players in the multiplayer mode. It’s pretty exciting to think about.
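    The general idea, learn from aggregated player behavior and have the AI sample from it, can be sketched in a few lines. This toy example is entirely hypothetical and has nothing to do with Turn 10’s actual Drivatar implementation:

```python
import random
from collections import Counter

# Toy illustration of "big data AI": aggregate how a real player behaves,
# then have an AI opponent sample from that learned behavior.

player_corners = ["brake_late", "brake_early", "brake_late", "apex_cut", "brake_late"]

profile = Counter(player_corners)           # learned tendencies from telemetry
actions, weights = zip(*profile.items())

def ai_corner_choice():
    # The AI opponent "drives like you" by sampling your observed habits.
    return random.choices(actions, weights=weights, k=1)[0]

print([ai_corner_choice() for _ in range(5)])
```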

    Of course, all of this is only being talked about for now without actually showing it in action. We’ll find out in November (maybe November 8?) how this all works.

    [Image: xbox/YouTube]

  • U.S.-Based Tech Firms May Lose Big Due To PRISM

    Cloud computing and storage is a multi-billion dollar business. Companies all over the world turn to Google, Microsoft and others to process and store their sensitive data. Corporate privacy policies ensure that this data remains secret, but that might not remain the case when the NSA comes knocking.

    A new study out of The Information Technology & Innovation Foundation has found that the recent revelations regarding PRISM – the NSA’s system of obtaining information from U.S.-based tech firms – has some foreign companies hesitant to do business with the likes of Google and Microsoft. Even before PRISM was leaked, there was already concern in the European community over how much information was being stored with American companies.

    That concern has now exploded into outright rejection as many European politicians and companies are calling for a boycott of American tech firms. The ITIF’s findings jibe with a recent poll conducted by The Cloud Security Alliance that found that 10 percent of 207 non-U.S. based tech firms had canceled their plans to utilize U.S.-based cloud services. Another 56 percent said that they were less likely to use those services.

    Overall, things are looking grim for the cloud computing business in the U.S. The ITIF report says that U.S.-based cloud services might lose a minimum of $21.5 billion over the next three years as more companies move their business to European and Asian tech firms. In a worst case scenario, the report says that U.S.-based cloud services could lose up to $35 billion by 2016.

    The ITIF ends its report with two recommendations for the U.S. government that it thinks will help restore faith in the nation’s tech industry:

    First, U.S. government needs to proactively set the record straight about what information it does and does not have access to and how this level of access compares to other countries. To do this effectively, it needs to continue to declassify information about the PRISM program and allow companies to reveal more details about what information has been requested of them by the government. The economic consequences of national security decisions should be part of the debate, and this cannot happen until more details about PRISM have been revealed.

    Second, the U.S. government should work to establish international transparency requirements so that it is clear what information U.S.-based and non-U.S.-based companies are disclosing to both domestic and foreign governments. For example, U.S. trade negotiators should work to include transparency requirements in trade agreements, including the Transatlantic Trade and Investment Partnership (TTIP) currently being negotiated with the EU.

    Some in the United States government are working to make the NSA more transparent, but they are constantly opposed by the Obama administration and the leaders of both parties in Congress. They argue that the NSA should remain unopposed and shrouded in secrecy for the safety of the nation. Opponents may want to start arguing that there’s safety in economic stability and keeping the NSA shrouded in secrecy threatens one of the nation’s fastest growing businesses.

  • The Pirate Bay Moves To The Pirate Cloud

    The Pirate Bay faithful were worried a few weeks ago when the torrent tracker went offline for two days. The folks running the show said it was a power outage that took longer than they had hoped to replace. The site is now back up and everything is fine. Well, the site did go down for five minutes yesterday, but it was for a good cause.

    The Pirate Bay has now officially moved to the cloud. They call it The Pirate Cloud, and it makes the site even more resilient to those who wish to shut down the tracker. Here’s the full statement from the The Pirate Bay:

    So, first we ditched the trackers.

    Then we got rid of the torrents.

    Now? Now we’ve gotten rid of the servers. Slowly and steadily we are getting rid of our earthly form and ascending into the next stage, the cloud.

    The cloud, or Brahman as the hindus call it, is the All, surrounding everything. It is everywhere; immaterial, yet very real.

    If there is data, there is The Pirate Bay.

    Our data flows around in thousands of clouds, in deeply encrypted forms, ready to be used when necessary. Earth bound nodes that transform the data are as deeply encrypted and reboot into a deadlock if not used for 8 hours.

    All attempts to attack The Pirate Bay from now on is an attack on everything and nothing. The site that you’re at will still be here, for as long as we want it to. Only in a higher form of being. A reality to us. A ghost to those who wish to harm us.

    Adapt or be forever forgotten beneath the veils of maya.

    Beyond the allusions to Hindu mythology, there’s some very real technical magic going on behind the scenes. The Pirate Bay told TorrentFreak that the entire site is now hosted via two cloud servers in two countries that run several virtual machine instances.

    So what happens if the host or the authorities find out where The Pirate Bay is being hosted? First of all, that would be almost impossible. The Pirate Bay is still operating a load balancer and transit routers across two countries that hide the identity of The Pirate Bay from its cloud hosts. If the host was able to somehow identify that they were hosting The Pirate Bay, they still wouldn’t be able to get any information out of the virtual machine due to the load balancer.

    To make things even more secure, the servers shut down if they can’t communicate with the load balancer for over eight hours. At that point, only the people with the encryption password can turn the servers back on.
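    The eight-hour lockdown behaves like a classic dead man’s switch. Here is a minimal conceptual sketch of that logic, not The Pirate Bay’s actual code:

```python
import time

# Conceptual sketch of the dead man's switch described above: if a node hasn't
# heard from the load balancer for 8 hours, it locks itself down.

TIMEOUT_SECONDS = 8 * 60 * 60

class Node:
    def __init__(self):
        self.last_heartbeat = time.time()

    def heartbeat(self):
        """Called whenever the load balancer successfully checks in."""
        self.last_heartbeat = time.time()

    def watchdog_tick(self):
        if time.time() - self.last_heartbeat > TIMEOUT_SECONDS:
            self.lock_down()

    def lock_down(self):
        # In the described setup, the encrypted VM halts and can only be
        # restarted by someone holding the encryption password.
        print("No contact for 8 hours: shutting down encrypted VM.")
```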

    In short, The Pirate Bay is more resilient to outside attacks than ever before. The site has been moving towards this for a while now. The police raid on their former host may have even pushed them into adopting the cloud that much faster. The Pirate Bay isn’t going anywhere, and authorities would only be wasting even more money and resources to go after them now that they’ve moved to the cloud.

  • Microsoft Buys CiS Solutions Provider StorSimple

    Microsoft is acquiring Cloud-integrated Storage (CiS) solutions provider StorSimple. Microsoft says the pick-up will advance its Cloud OS vision and help customers embrace hybrid cloud computing more efficiently.

    “Customers faced with explosive growth in data are looking to the cloud to help them store, manage and archive that data. But, to be effective, cloud storage needs to integrate with IT’s current investments,” said Michael Park, corporate vice president, Server and Tools Division for Microsoft. “StorSimple’s approach helps customers seamlessly integrate on-premises storage with cloud storage through intelligent automation and management.”

    Park spoke further about the announcement on the Windows Azure blog, where he notes, “CiS is a rapidly emerging category of storage solutions that consolidate the management of primary data, backup disaster recovery and archival data, and deliver seamless integration between on premise and cloud environments. This seamless integration and orchestration enables new levels of speed, simplicity and reliability for backup and disaster recovery (DR) while reducing costs for both primary data and data protection.”

    “You may have heard us talk about the ‘Cloud OS’ over the last few months – the Cloud OS is our vision to deliver a consistent, intelligent and automated platform of compute, network and storage across a company’s datacenter, a service provider’s datacenter and the Windows Azure public cloud,” he notes. “With Windows Server 2012 and Windows Azure at its core, and System Center 2012 providing automation, orchestration and management capabilities, the Cloud OS helps customers transform their data centers for the future.”

    StorSimple’s CEO had this to say about the news: “Most StorSimple customers are mainstream IT organizations that have chosen Windows Azure as their primary cloud. We are excited to continue to work with Microsoft and bring the combined benefits of StorSimple and Windows Azure to customers around the world.”

    Terms of the deal have not been disclosed.

  • Amazon Web Services Is A Hit Among Government Agencies

    Amazon, Google and other cloud storage companies have found great success in the private sector by offering storage solutions to consumers and companies alike. Negotiating contracts with the public sector is an entirely different beast altogether, but Amazon has emerged as a major player in the field.

    Amazon announced today that over 300 government agencies and 1,500 public education institutions are now using Amazon Web Services. Some of these agencies and education institutions include major players like NASA, the Centers for Disease Control and Prevention, MIT, the University of Oxford, and the University of California, Berkeley.

    “Government agencies and education institutions are rapidly accelerating their adoption of the AWS Cloud as organizations worldwide realize that they can be more innovative, agile and efficient by using the cloud for their technology infrastructure,” said Teresa Carlson, Vice President of Worldwide Public Sector, AWS. “In addition, with initiatives such as the US Federal Cloud First mandate and the European Cloud Partnership, organizations are looking for ways to quickly move new and existing business and mission workloads to the cloud in a secure, compliant and cost-effective manner. With the new services and features added today in AWS GovCloud, public sector customers now have greater capabilities to rapidly design, build and deploy high performance applications with AWS’s scalable, secure, low-cost platform.”

    One of the more interesting uses of Amazon Web Services is the CDC’s BioSense 2.0. It’s a cloud-based public health monitoring service that allows the CDC to collect data from over 2,000 facilities and quickly respond to new health threats. The CDC said that switching to the cloud is saving the agency money while creating additional jobs.

    As for public universities, Amazon Web Services has given out $4 million in grants to 350 universities in 35 countries to help implement AWS projects in the classroom and across campus. One of those programs is taking place at the University of San Francisco, where students use AWS in the Master’s in Analytics program to gain real-world experience analyzing large quantities of data. Students can now run large-scale data sets without the need for additional expensive hardware.

    Amazon has obviously been busy in the public sector, but what about their main competition? Google Apps for Government, which includes Google Drive, has been adopted by governments in 45 states. A lot of agencies probably use both services, but it would be interesting to see the number of agencies who use Google App Engine over AWS for their cloud-based application needs.

    Either way, government agencies have been pushing hard for more cloud integration after the introduction of the Cloud First Mandate last year. The deadline for the mandate was June 8, 2012, but it remains to be seen how many government agencies actually moved their existing IT services to the cloud by that time. Regardless, AWS is obviously profiting from the government’s move to the cloud.

  • Microsoft Updates The Windows Azure SQL Database

    Microsoft has been very aggressive over the past year with its Windows Azure service. The cloud computing platform has been positioned as a worthy competitor to Google’s App Engine service, and Microsoft continually provides new updates to it. The latest update brings a number of new features to the SQL database.

    Microsoft recently detailed the major updates that hit the Windows Azure SQL Database. The update adds four new features: Linked Server support, recursive triggers, DBCC SHOW_STATISTICS, and database-level firewall rules.

    First up, it was revealed that it’s now possible to add a Windows Azure SQL Database as a Linked Server, which can then be used with Distributed Queries across both local and cloud databases. The new feature allows users to write queries that combine data from local networks with cloud data. This functionality was available in Windows Azure before, but it relied on a method that was not very good for performance. Here’s an example of how to connect to the Windows Azure SQL Database through Distributed Queries:

    [Image: T-SQL example of querying a Windows Azure SQL Database through a Linked Server]
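    Since the original screenshot isn’t reproduced here, the following is a rough sketch of the flow from an on-premises SQL Server, driven from Python via pyodbc. Server names, credentials, and table names are placeholders, and the exact provider settings should be checked against Microsoft’s documentation:

```python
import pyodbc

# Rough, hedged sketch of the Linked Server + Distributed Query flow.
# All server, login, and table names below are placeholders.

local = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;DATABASE=master;Trusted_Connection=yes",
    autocommit=True,
)
cur = local.cursor()

# Register the Windows Azure SQL Database as a linked server on the local instance.
cur.execute("""
    EXEC sp_addlinkedserver
        @server     = N'AzureDB',
        @srvproduct = N'',
        @provider   = N'SQLNCLI',
        @datasrc    = N'myserver.database.windows.net',
        @catalog    = N'MyCloudDatabase';
""")
cur.execute("""
    EXEC sp_addlinkedsrvlogin
        @rmtsrvname  = N'AzureDB',
        @useself     = 'FALSE',
        @rmtuser     = N'clouduser',
        @rmtpassword = N'***';
""")

# A distributed query joining a local table with a cloud table via the linked server.
cur.execute("""
    SELECT o.OrderID, c.CustomerName
    FROM   LocalDB.dbo.Orders AS o
    JOIN   AzureDB.MyCloudDatabase.dbo.Customers AS c
           ON c.CustomerID = o.CustomerID;
""")
for row in cur.fetchall():
    print(row)
```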

    Another new feature is expanded support for recursive triggers. Microsoft notes that triggers will now call themselves recursively by default, but this behavior can be turned on or off at the user’s discretion. Check out the documentation for more information.

    The SQL database also features support for DBCC SHOW_STATISTICS, which lets users see the current query optimization statistics and estimate the “cardinality or number of rows in the query result,” which the query optimizer uses to build a high-quality query plan. You can check out the documentation here.

    Finally, Microsoft has added the ability to configure SQL database firewall rules at the database level. It’s a step up from the previous firewall management tools that only allowed users to set the rules at the server level. Implementation at the database level allows users to set different rules for different databases. You can check out the documentation here.
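    The remaining features can be exercised directly against the cloud database. The sketch below is illustrative only: the connection string, rule name, IP range, and table/index names are placeholders, and the recursive-trigger toggle shown is the standard SQL Server database option, which may behave slightly differently in Azure:

```python
import pyodbc

# Hedged sketch of the other features described above, run against the
# cloud database directly. Names and addresses are placeholders.

azure = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=myserver.database.windows.net;"
    "DATABASE=MyCloudDatabase;UID=clouduser;PWD=***",
    autocommit=True,
)
cur = azure.cursor()

# Database-level firewall rule (scoped to this database, not the whole server).
cur.execute("""
    EXECUTE sp_set_database_firewall_rule
        @name             = N'OfficeNetwork',
        @start_ip_address = '203.0.113.1',
        @end_ip_address   = '203.0.113.254';
""")

# Toggle the recursive-trigger behavior via the standard database option.
cur.execute("ALTER DATABASE CURRENT SET RECURSIVE_TRIGGERS OFF;")

# Inspect the query-optimization statistics exposed via DBCC SHOW_STATISTICS.
cur.execute("DBCC SHOW_STATISTICS ('dbo.Customers', 'IX_Customers_Name');")
print(cur.fetchall())
```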

  • Taco Cloud Truly Is The Future Of Computing

    It was revealed recently that a lot of Americans don’t truly grasp what the Cloud is yet. In fact, some of them even believe that it can be disrupted by stormy weather. I blame the ignorance on a tech sector that does little to make the Cloud matter in our everyday lives. Thankfully, somebody has heard that call and has come up with the perfect solution to get America invested in Cloud computing.

    Say hello to Taco Cloud, a new service that can beam tacos straight to your microwave. It’s hard to describe the technical magic at work here so I’ll just let the creators do it for me:

    Taco Cloud takes your microwave to the next level by allowing Taco Vendors to upload their Tacos to the Cloud. The Tacos are then indexed by a complex taco algorithm and consumers can then search and buy their favorite tacos at the click of a button. Taco Cloud takes a small percentage of the transaction and charges Taco Vendors disposal fee for tacos that don’t sell. The Taco Storage Device is FDA approved and cleaned out once a week as the FDA recommends that a Taco’s shelf live not exceed a week.

    In all reality, Taco Cloud is a lot like Taco Copter. These are obviously fake products, but they serve an important purpose: they help people better understand what the Cloud is capable of. Replace the taco with 1GB of documents, and you have a service that can transfer large files that were impossible to send over email, thanks to Cloud storage.

    Even if Taco Cloud isn’t real, I do hope that technology can advance enough to achieve this kind of result one day. Particle disassembly and reassembly shouldn’t be that hard. I think America would owe whoever truly invents the Taco Cloud a debt of thanks.

    [h/t: Reddit]

  • Lenovo Acquires Cloud Computing Company Stoneware

    Lenovo announced today that it has acquired cloud computing services company Stoneware Inc. Lenovo states that cloud computing solutions are now a key component of its product portfolio. The company expects the acquisition to speed along its offerings, particularly “the ability to provide secure content across multiple devices in education and government.”

    Stoneware is a privately held company that develops cloud computing and classroom management software. It is currently headquartered in Indianapolis, Indiana and has 67 employees. The price paid for Stoneware has not been disclosed, but Lenovo did state the acquisition “is not material to Lenovo’s earnings.”

    “Adding Stoneware cloud computing into the Lenovo line up presents a significant opportunity to leverage their success, and enhance our PC Plus offerings, all to the benefit of our customers,” said Peter Hortensius, president of the product group at Lenovo. “We have a history of innovation and embracing new technologies, and the talented team at Stoneware will fit in perfectly with our long-term strategy.”

    It seems that Lenovo will be focusing on the enterprise side of its business. Along with the acquisition announcement, Lenovo stated that it is “aggressively expanding its product offerings” to people and businesses, specifically emphasizing connectedness across multiple devices. Lenovo expects to leverage Stoneware’s software to help users connect PCs, tablets, and smartphones. The company “aims to offer” secure, end-to-end solutions for business customers.

    “We are pleased to be joining forces with Lenovo,” said Rick German, CEO of Stoneware. “Lenovo is one of the largest and fastest growing technology companies in the world and for Stoneware, a small company with roots in the heartland of the United States, we are delighted to be given the opportunity to deliver real benefit to customers on a global stage.”

  • Amazon Web Services Opens Reserved Instance Resale Market

    Amazon Web Services (AWS), Amazon’s cloud computing solutions provider, today announced the launch of its Reserved Instance Marketplace.

    Amazon’s Reserved Instances allow businesses to pay a one-time fee to reserve computing capacity for a specified term and receive a discount on the hourly rate for the instance. Now, customers will no longer have to worry as much about biting off more than they can chew.

    The Reserved Instance Marketplace will become a secondhand market for reserved instances. Customers will be able to sell their reserved instances to other businesses or buy reserved instances from other AWS customers. Amazon claims the new marketplace will enable customers to find a wider selection of reserved instance term lengths and prices than AWS sells.

    “AWS has long given customers multiple ways to use Amazon EC2 and save money,” said Peter De Santis, vice president of Amazon EC2. “For those wanting to pay for compute by the hour with no required upfront fee or ongoing commitment, Amazon EC2 offers On Demand Instances. For those willing to pay a small upfront fee in exchange for a substantial discount over On Demand Instances (up to 71%), Amazon EC2 offers Reserved Instances. As more and more customers have started buying Reserved Instances, they’ve asked for ways to sell their Reserved Instances to change instance types or where instances are located, and they’ve asked for more Reserved Instance term length options. The Reserved Instance Marketplace addresses both of these customer needs.”
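    The economics De Santis describes, an upfront fee traded for a lower hourly rate, boil down to a simple break-even calculation. The prices below are made up for illustration; real AWS rates vary by instance type, region, and term:

```python
# Toy break-even comparison between On Demand and Reserved Instances.
# All prices and the upfront fee are hypothetical.

on_demand_hourly = 0.12      # hypothetical $/hour
reserved_upfront = 300.00    # hypothetical one-time fee for a 1-year term
reserved_hourly  = 0.05      # hypothetical discounted $/hour

def yearly_cost(hours_used):
    on_demand = hours_used * on_demand_hourly
    reserved = reserved_upfront + hours_used * reserved_hourly
    return on_demand, reserved

for hours in (1000, 4000, 8760):  # light use, moderate use, always-on
    od, ri = yearly_cost(hours)
    better = "reserved" if ri < od else "on demand"
    print(f"{hours:>5} hrs/yr: on-demand ${od:,.0f} vs reserved ${ri:,.0f} -> {better}")
```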

    AWS earlier this year launched the AWS Marketplace, a storefront that allows customers to shop and price cloud computing solutions in much the same way customers would shop for books on Amazon. The AWS Reserved Instance Marketplace, when it finally becomes available, will be located here.

  • Google Adds Batch Queries, Excel Connector To BigQuery

    Google announced today that it has added a couple of new features to BigQuery, its Big Data analysis service. It now supports batch queries, and has a connector for Excel.

    “While BigQuery specializes in getting insights quickly, we understand that there are important, non-interactive queries, such as nightly reports, that businesses also need to run,” says product manager Ju-kay Kwek. “Now, you can designate a query as a batch query and it will complete within a few hours. If you’re using BigQuery via our standard self-service model, you pay 2 cents per GB processed for batch queries and 3.5 cents per GB processed for interactive queries.”
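    Using the per-GB rates quoted above, the savings from flagging a job as a batch query are easy to estimate; the 500GB report size below is just an example:

```python
# Quick cost comparison using the per-GB rates quoted above.

def bigquery_cost(gb_processed, batch=False):
    rate = 0.02 if batch else 0.035   # $/GB, from the quote above
    return gb_processed * rate

nightly_report_gb = 500
print(f"Interactive: ${bigquery_cost(nightly_report_gb):.2f}")               # $17.50
print(f"Batch:       ${bigquery_cost(nightly_report_gb, batch=True):.2f}")   # $10.00
```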

    “Analysts and executives use spreadsheets to explore large data sets,” adds Kwek. “Last year, we launched the ability for BigQuery users to execute queries inside Google spreadsheets using the Google Apps Script integration. With the new BigQuery Connector for Excel, we’re now making it simpler to execute BigQuery queries using Microsoft® Excel. This connector takes advantage of Excel’s standard web query feature to eliminate the extra work of manually importing data and running queries directly within Excel. For instructions on how to download and use the connector, see the BigQuery Connector for Excel page.”

    Ryan Boyd, Developer Advocate for Cloud Data Services, has more on using the new features on Google’s Developers Blog.