As artificial intelligence (AI) continues to evolve and improve, researchers are offering a dire warning, saying super-intelligent AI will be impossible to control.
AI is one of the most controversial technological developments. Its proponents claim it will revolutionize industries, solve a slew of the toughest problems and lead to the betterment of humankind. Its critics believe it represents an existential threat to humanity, and will eventually evolve beyond man’s ability to control it.
An international team of researchers is now saying, based on theoretical calculations, that AI will evolve beyond our ability to control it. In a paper published in the Journal of Artificial Intelligence Research, researchers Manuel Alfonseca, Manuel Cebrian, Antonio Fernandez Anta, Lorenzo Coviello, Andrés Abeliuk and Iyad Rahwan make the case “that total containment is, in principle, impossible, due to fundamental limits inherent to computing itself.”
While it may seem unlikely that AI could evolve to such a point, co-author Manuel Cebrian, Leader of the Digital Mobilization Group at the Center for Humans and Machines, Max Planck Institute for Human Development, argues that AI is already reaching this point to some degree.
A super-intelligent machine that controls the world sounds like science fiction. But there are already machines that perform certain important tasks independently without programmers fully understanding how they learned it. The question therefore arises whether this could at some point become uncontrollable and dangerous for humanity.
The study, entitled “Superintelligence Cannot be Contained: Lessons from Computability Theory,” could very well have far-reaching implications for AI research.
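The core of the argument comes from computability theory: a containment algorithm that could always determine whether an arbitrary program will harm humans would also solve Turing’s halting problem, which is provably impossible. The sketch below is a loose illustration of that diagonal argument, not the paper’s formal proof; the `harms_humans` oracle and `adversary` program are hypothetical stand-ins.

```python
# Illustrative sketch of the diagonal argument (hypothetical names,
# not the paper's formalism). Suppose harms_humans(program, input)
# could always decide whether running `program` on `input` leads to harm.

def harms_humans(program_source: str, program_input: str) -> bool:
    """A perfect containment oracle. No such total, always-correct
    function can exist, for the same reason the halting problem is
    undecidable."""
    raise NotImplementedError

# A program can then be built that feeds its own source code to the
# oracle and does the opposite of whatever the oracle predicts:
adversary_source = '''
def adversary(own_source):
    if harms_humans(own_source, own_source):
        return            # predicted harmful -> behave safely
    do_harm()             # predicted safe -> behave harmfully
'''
# Whatever the oracle answers about `adversary`, the answer is wrong,
# so a perfect containment check is impossible in general.
```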
Researchers at Uber are proposing a new artificial intelligence (AI) language model that emphasizes positive, social interaction.
AI is one of the most important developments in the tech industry. Increasingly, it is being used in a wide array of fields, often with the goal of assisting individuals with mundane tasks. Chat bots, support agents and conversational AIs are just a few examples. One challenge, however, is making AIs that people will engage with.
Researchers at Uber believe they have the solution, and have written a paper emphasizing the importance of developing an AI language model around positive, social interaction.
Goal-oriented conversational agents are becoming prevalent in our daily lives. For these systems to engage users and achieve their goals, they need to exhibit appropriate social behavior as well as provide informative replies that guide users through tasks.
The researchers hypothesized that an AI using positive interaction would encourage more engagement.
We, therefore, hypothesize that users would prefer a conversational agent with more polite or more positive language and be more willing to engage with, respond to and persist in the interaction when conversing with an agent using polite and positive language.
Uber’s researchers tested their hypothesis in a ride-sharing environment, where new drivers’ on-boarding was guided by text messages from customer support representatives (CSR).
In this Study 1 we investigated whether and how social language is related to user engagement in task-oriented conversations. We used existing machine learning models to measure politeness and positivity in our analyses. The results show that the politeness level in CSR messages was positively correlated with driver’s responsiveness and completion of their first trip. We also found that positivity positively predicts driver’s first trip, but it has a negative relationship to driver responsiveness even after removing congratulatory milestone messages or messages that do not have any question mark, which usually have positive sentiment and/or do not require responses from drivers.
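The analysis itself is conceptually straightforward: score each support message for politeness or positivity, then test whether the scores correlate with driver behavior. Here is a minimal sketch of that kind of correlation test, using made-up scores and outcomes rather than Uber’s data or models:

```python
# Minimal sketch of the study's correlation analysis, using made-up
# politeness scores and driver outcomes (not Uber's data or models).
from scipy.stats import pearsonr

# One politeness score per CSR message thread, and whether the driver
# responded (1) or not (0).
politeness_scores = [0.82, 0.41, 0.93, 0.37, 0.68, 0.55, 0.71, 0.30]
driver_responded  = [1,    0,    1,    0,    1,    1,    1,    0   ]

r, p = pearsonr(politeness_scores, driver_responded)
print(f"politeness vs. responsiveness: r={r:.2f} (p={p:.3f})")
```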
Uber’s research could be an important stepping stone in the ongoing development of AI, ensuring it best supports human needs.
Google Cloud has announced it has opened three new cloud regions, expanding its worldwide cloud presence.
Google currently stands in third place, behind AWS and Microsoft Azure, in the cloud market. CEO Thomas Kurian has made it his stated goal to lead Google Cloud into at least the number two spot within five years. A big step in that direction is expanding Google’s cloud regions.
“Today, we’re excited to announce the expansion of our global network with new cloud regions in Chile, Germany and Saudi Arabia,” writes Dave Stiver, Senior Product Manager, GeoExpansion. “When launched, each region will have three zones to protect against service disruptions, and include a portfolio of key Google Cloud products, while offering lower latency to nearby users and a more robust global network of regions for multinational enterprises.”
Having local cloud regions enables Google Cloud customers to better serve their own customers’ needs.
“Google Cloud is a strategic partner as we optimize our operations performance to better serve our customers around the world,” says Henning Krüger, VP Ops Suite at Lufthansa Group. “We’re digitizing our operations atop Google Cloud’s global infrastructure, and we’re using their machine learning capabilities to combine previously disparate systems and data feeds into one unified platform.”
Marking the 15th anniversary of the United Nations’ International Anti-Corruption Day, Microsoft has unveiled Anti-Corruption Technology and Solutions (ACTS).
Corruption continues to be a significant problem for both governments and the private sector. While many different methods of combating corruption have been tried, Microsoft believes artificial intelligence may hold the key.
“In the next decade, Microsoft ACTS will leverage the company’s investments in cloud computing, data visualization, AI, machine learning, and other emerging technologies to enhance transparency and to detect and deter corruption,” writes Dev Stahlkopf, Corporate Vice President and General Counsel. “We will endeavor to bring the most promising solutions to the broadest possible audience, using our partner networks, programs, and global employee base to scale solutions through careful consideration of their priorities, technical infrastructure, and capabilities.
“Over the last six months, we have already begun to make investments in support of the Microsoft ACTS initiative, including a partnership with the Inter-American Development Bank to advance anti-corruption, transparency, and integrity objectives in Latin America and the Caribbean. Announced in July 2020, we are partnering with the IDB Transparency Fund to help bring greater transparency to the use of Covid-19 economic stimulus funds, building on the Mapa Inversiones platform developed by the IDB with Microsoft support and already adopted by many countries in the region. In the coming months and years, we look forward to additional partnerships, learning as we go, and empowering the work of others.”
With UN estimates placing the cost of corruption at $3.6 trillion a year, Microsoft ACTS will likely see widespread adoption. The technology illustrates yet another way in which AI can be put to good use.
Fully autonomous cars may not be here yet, but artificial intelligence (AI) is already piloting Loon’s fleet of internet-providing balloons.
Loon is one of Alphabet’s companies dedicated to providing internet access to rural and underserved areas. The company uses high-altitude balloons that drift in the stratosphere, 11 to 16 miles above the Earth’s surface, creating a wireless network to provide internet access.
The company announced in a blog post that it has turned over navigation of the balloons to Google’s deep reinforcement learning AI.
“In our ongoing efforts to improve Loon’s navigation system for our stratospheric connectivity mission, a small group of colleagues at Loon and Google AI had been working to develop a more powerful navigation system that leverages deep reinforcement learning (RL), which is a type of machine learning technique that enables an agent to learn by trial and error in an interactive environment using feedback from its own actions and experiences,” writes Salvatore Candido, Loon CTO. “This contrasts against the conventional approach of the automated system following fixed procedures artisanally crafted by engineers.”
While some may question whether the balloon’s RL qualifies as a true AI, Candido believes it has now crossed that line.
“In my last post about Loon’s navigation system, I asked the question of whether we were dealing with AI. My answer was uncertain,” continues Candido. “This time my answer is even more nuanced. While there is no chance that a super-pressure balloon drifting efficiently through the stratosphere will become sentient, we have transitioned from designing its navigation system ourselves to having computers construct it in a data-driven manner. Even if it’s not the beginning of an Asimov novel, it’s a good story and maybe something worth calling AI.”
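Reinforcement learning of the kind Candido describes boils down to a trial-and-error loop: act, observe a reward, and update the policy. The toy tabular Q-learning sketch below illustrates that loop on a made-up station-keeping problem; Loon’s actual system uses deep neural networks trained on atmospheric simulations, not a lookup table.

```python
# Toy tabular Q-learning on a made-up station-keeping problem: the
# state is the balloon's distance from its target spot, and changing
# altitude catches winds that drift it closer or farther. Illustrative
# only; Loon's controller is a deep network, not a lookup table.
import random
from collections import defaultdict

ACTIONS = ["ascend", "descend", "hold"]

def step(state, action):
    drift = {"ascend": -1, "descend": 1, "hold": 0}[action]
    noise = random.choice([-1, 0, 1])           # stochastic winds
    next_state = max(0, min(10, state + drift + noise))
    return next_state, -next_state              # reward: closer is better

q = defaultdict(float)
alpha, gamma, epsilon = 0.1, 0.95, 0.1          # learning hyperparameters

for episode in range(500):
    state = random.randint(0, 10)
    for _ in range(50):
        if random.random() < epsilon:           # explore
            action = random.choice(ACTIONS)
        else:                                   # exploit current policy
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        next_state, reward = step(state, action)
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next
                                       - q[(state, action)])
        state = next_state
```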
Whatever degree of AI Loon’s navigation system achieves, it’s a significant development in the technology.
Informally, industrial revolutions are numbered like software versions, each one an Industry “point-oh.” The First Industrial Revolution, or Industry 1.0, took place between 1760 and 1830, and the second followed shortly after, between 1870 and 1914. Between 1950 and 2002, the world underwent “digitalization” as a result of the Third Industrial Revolution, or Industry 3.0; and since 2011, we have been undergoing the Fourth Industrial Revolution, more commonly known as Industry 4.0. As a result of digitalization, data intelligence has become a primary driver of each new industrial revolution.
Under Industry 1.0 (1760-1830), machines and tools began to replace animal and human labor, a monumental change for its time. The use of iron and steel for machinery skyrocketed, and new power sources, most notably the steam engine, came to drive economic sectors in their own right.
Under Industry 2.0 (1870-1914), mass production boomed. Assembly lines raised efficiency and productivity for the first time, while railways and telegraphs, themselves products of Industry 2.0, made shipping and communication far easier. New materials such as stainless steel and plastics were introduced as well.
Things got more technological under Industry 3.0 (1950-2002). The Third Industrial Revolution introduced electronics and IT and integrated them into manufacturing procedures. As a result, society saw a massive rise in telecommunications, computers, and even nuclear power. Factory automation also became widespread, with robots and programmable logic controllers (PLCs) incorporated into the general workflow.
Since 2011, interconnectivity has been the key focus. Already, Industry 4.0 is set to provide higher-level automation driven by artificial intelligence, as well as optimized manufacturing using real-time data and sensors. Additionally, this Industrial Revolution is focusing on a way to integrate cyber-physical systems throughout the supply chain.
Ultimately, Industry 4.0 will use big data and machine learning to automate plants, warehouses, machines, and more. It will also create smart machines capable of collecting and analyzing data, as well as communicating the right information at the right time.
In other words, the Fourth Industrial Revolution will deliver improvements across three areas: smart communication, data quality, and smart devices.
Smart communication allows manufacturers to rapidly respond to changing demand, inventory shortfalls, or equipment faults. Data quality helps companies locate problems so they can respond to them faster, and it can be refined through organization-wide networks. Smart devices create increasingly autonomous ecosystems that act as a catalyst for the future of the industry; examples include driverless vehicles, which can navigate factories and warehouses, and drones, which can be used for maintenance and inventory management.
With that in mind, business owners should seek insight into the ways they’re being impacted by Industry 4.0. In other words, this is a great time to prepare an effective data structure, focus on high-fidelity data creation and communication, standardize business and data processes, and understand your business’s use case. If even one important portion of the data is missing, it can break the digital thread, causing the flow of data to stop.
Amazon has migrated the majority of Alexa to the next generation of its custom silicon chips.
Last year, Amazon was reported to be working on the next generation of its ARM-based custom silicon, as it works to improve cost, performance and efficiency. The company’s latest effort is AWS Inferentia, a chip with four NeuronCores. The NeuronCores are designed to speed up deep learning operations, making them an ideal option for powering Alexa.
“Today, we are announcing that the Amazon Alexa team has migrated the vast majority of their GPU-based machine learning inference workloads to Amazon Elastic Compute Cloud (EC2) Inf1 instances, powered by AWS Inferentia,” writes Sébastien Stormacq. “This resulted in 25% lower end-to-end latency, and 30% lower cost compared to GPU-based instances for Alexa’s text-to-speech workloads. The lower latency allows Alexa engineers to innovate with more complex algorithms and to improve the overall Alexa experience for our customers.
“AWS built AWS Inferentia chips from the ground up to provide the lowest-cost machine learning (ML) inference in the cloud. They power the Inf1 instances that we launched at AWS re:Invent 2019. Inf1 instances provide up to 30% higher throughput and up to 45% lower cost per inference compared to GPU-based G4 instances, which were, before Inf1, the lowest-cost instances in the cloud for ML inference.”
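For teams that want to experiment, Inf1 instances are launched like any other EC2 instance type. Here is a minimal boto3 sketch, with a placeholder AMI ID; in practice you would pick an AWS Deep Learning AMI that bundles the Neuron SDK, and you would need valid AWS credentials:

```python
# Hedged sketch: launching an Inferentia-powered Inf1 instance via
# boto3. The AMI ID is a placeholder, not a real image.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder Deep Learning AMI
    InstanceType="inf1.xlarge",        # smallest Inf1 size
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```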
This has been a big week for custom silicon, between Apple unveiling its first Macs running on its M1 chip Tuesday, and now AWS’ announcement.
Researchers at Carnegie Mellon University have created a machine learning model to detect the direction of an incoming voice.
Current smart speakers and voice-activated devices rely on activation keywords to listen and then respond to commands. While largely effective, it can create problems when there are multiple devices that use the same keyword, or when someone uses that keyword in normal conversation.
The researchers at Carnegie Mellon University set out to solve this by using machine learning to address the problem of addressability; in other words, helping devices know whether a command was directed at them specifically.
The research aimed to recreate elements of human-human communication, specifically how people can address a specific person in a crowded room. If computers can learn directional conversation, it will make it much easier to control devices and interact with them much like interacting with a human being.
“In this research, we explored the use of speech as a directional communication channel,” write (PDF) researchers Karan Ahuja, Andy Kong, Mayank Goel and Chris Harrison. “In addition to receiving and processing spoken content, we propose that devices also infer the Direction of Voice (DoV). Note this is different from Direction of Arrival (DoA) algorithms, which calculate from where a voice originated. In contrast, DoV calculates the direction along which a voice was projected.
“Such DoV estimation innately enables voice commands with addressability, in a similar way to gaze, but without the need for cameras. This allows users to easily and naturally interact with diverse ecosystems of voice-enabled devices, whereas today’s voice interactions suffer from multi-device confusion.”
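While the researchers’ full pipeline is more sophisticated, the underlying intuition is that speech projected toward a microphone retains more high-frequency energy than speech projected away from it. The toy sketch below trains a classifier on a single band-energy feature using synthetic audio; it is illustrative only, not the paper’s method:

```python
# Toy direction-of-voice classifier (not the paper's method): speech
# aimed at the mic keeps more high-frequency energy than speech aimed
# away, so even one band-energy ratio carries signal.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def band_ratio(audio, sr=16000):
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sr)
    low = spectrum[freqs < 1000].sum()
    high = spectrum[freqs >= 4000].sum()
    return [high / (low + 1e-9)]

rng = np.random.default_rng(0)
toward = [rng.normal(0, 1, 16000) for _ in range(20)]      # full band
away = [np.convolve(rng.normal(0, 1, 16000),               # low-passed,
                    np.ones(8) / 8, "same") for _ in range(20)]  # ~facing away

X = np.array([band_ratio(c) for c in toward + away])
y = np.array([1] * 20 + [0] * 20)        # 1 = spoken toward the device
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```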
This research is an important development and could have a profound impact on how humans interact with everything from smart speakers to more advanced AIs.
More evidence suggests that Apple is working on its own search engine to challenge Google’s dominance.
In many ways, the new report doesn’t add much to previous reports from August. When the news first broke, Coywolf founder Jon Henshaw noticed a web crawler called AppleBot crawling his website. At the same time, AppleInsider noticed changes in how iOS 14 handled search compared with iOS 13.
Now the Financial Times says that multiple search experts are seeing a steep increase in Applebot activity. The paper also points to Apple’s poaching of John Giannandrea, Google’s head of search, two and a half years ago. He currently serves as Apple’s senior vice president of Machine Learning and AI Strategy, putting him in a strategic position to have a significant impact on the company’s efforts.
Other experts believe Apple has the technical expertise to build a successful search engine.
“They [Apple] have a credible team that I think has the experience and the depth, if they wanted to, to build a more general search engine,” said Bill Coughran, Google’s former engineering chief, according to FT.
The timing may ultimately work in Apple’s favor as the company’s deal with Google, to make its search engine the default on iOS, is one of the factors in the government’s antitrust lawsuit against Google.
T-Mobile has launched T-Mobile Ventures, a fund aimed at backing companies “developing transformative 5G products and services for the T-Mobile network.”
Carriers around the country are rushing to roll out 5G networks, while businesses and customers are eager to take advantage of the benefits it offers. T-Mobile is one of the leading 5G providers, offering the full range of 5G: low-band, mid-band and mmWave.
Because of the speeds 5G offers, it is opening up new opportunities in artificial intelligence, edge computing, cloud computing, machine learning and more. As a result, a new generation of companies are developing products and services that take advantage of 5G. T-Mobile Ventures’ goal is to help these companies succeed.
“T-Mobile Ventures is part of our mission to give customers the best 5G network in the country – one that will serve all Americans, stimulate competition and create tremendous economic value,” said Jason Young, Senior Vice President of Partnerships and T-Mobile Ventures. “With our 5G network at the foundation, we see massive opportunity across both business and consumer segments, and we’re excited to help fuel the wave of 5G applications coming to market in the years ahead.”
Verizon is partnering with Microsoft to create new ways for enterprises to accelerate the delivery of fast, secure 5G applications, enabling state-of-the-art low-latency IoT solutions.
Verizon’s on-site 5G Edge network integrated with Azure edge services can enable ultra-low latency, many times faster than the blink of an eye, according to Verizon, which can help businesses tap into real-time data analysis and delivery. Applications incorporating computer vision; augmented, mixed and virtual reality; digital twins; or machine learning can be enhanced with 5G and MEC on the customer premises, helping transform the way industries such as retail, transportation, and logistics operate.
Think of automated high-precision asset localization, tracking and positioning in manufacturing. In healthcare, the increased speed, reduced latency and high bandwidth connectivity of 5G networks could enable real-time precision medicine leveraging mixed reality and AI capabilities as well as seamless and fast sharing of large files to improve patient care.
“We have built a network that provides real-world, 5G-enabled solutions TODAY,” said Rima Qureshi, EVP and Chief Strategy Officer at Verizon. “By bringing together Verizon’s 5G network and on-site 5G Edge platform with Microsoft’s expertise in cloud services, we will enable the development of the next-generation technologies everyone has been envisioning.”
The collaboration brings Azure cloud and edge capabilities together with Verizon’s on-site 5G Edge, a mobile edge computing platform designed to enable developers to build applications for mobile end-users and wireless edge devices with ultra-low latency. By utilizing on-site private 5G, businesses will be able to realize increased power efficiencies and reduced costs of end user devices while addressing their privacy and security needs.
Logistics and supply chain solutions company Ice Mobility is already testing on Verizon’s on-site 5G Edge platform, integrated with Microsoft Azure. The company is using 5G and MEC to help with computer vision assisted product packing. By gathering data in near real-time on product packing errors, the company has the potential to improve on-site quality assurance and save 15% to 30% in processing time.
“We are especially excited to join Verizon and Microsoft to test how 5G and MEC can improve the quality assurance process,” said Mike Mohr, CEO of Ice Mobility. “They truly have listened to our needs to provide automated real-time quality oversight and feedback, which will enable us to cost-effectively launch unique new products, while maintaining the highest execution standards, significantly increasing throughput and reducing costs. And, this is just the beginning.”
“By leveraging Verizon’s 5G network integrated with Microsoft’s cloud and edge capabilities, developers and businesses can benefit from fast, secure and reliable connections to deliver seamless digital experiences from massive industrial IoT workloads to precision medicine,” said Yousef Khalidi, corporate vice president, Azure for Operators at Microsoft.
Moving forward, Verizon will explore opportunities to co-innovate with Microsoft to deliver new value to industries ranging from manufacturing to healthcare.
Verizon’s 5G Ultra Wideband network enables throughput at least 25 times faster than today’s 4G networks; delivers ultra-low latency; and offers very high bandwidth. Verizon 5G Ultra Wideband is expected to eventually enable 100 times larger data volumes than 4G, and the ability to connect more than a million devices per square kilometer. Verizon’s 5G Ultra Wideband service is available to people in 55 cities and its 5G Nationwide service is available to more than 200 million people in more than 1,800 cities around the U.S.
IBM and ServiceNow are partnering to provide enterprise solutions that utilize AI to automate IT operations. The new joint solution combines IBM’s AI‑powered hybrid cloud software and professional services with ServiceNow’s intelligent workflow capabilities and IT service and operations management products. The solution surfaces deep AI‑driven insights from customers’ data and then recommends actions IT organizations can take to help prevent and fix IT issues at scale.
“AI is one of the biggest forces driving change in the IT industry to the extent that every company is swiftly becoming an AI company,” said Arvind Krishna, Chief Executive Officer, IBM. “By partnering with ServiceNow and their market-leading Now Platform, clients will be able to use AI to quickly mitigate unforeseen IT incident costs. Watson AIOps with ServiceNow’s Now Platform is a powerful new way for clients to use automation to transform their IT operations.”
“For every CEO, digital transformation has gone from opportunity to necessity,” said ServiceNow CEO Bill McDermott. “As ServiceNow leads the workflow revolution, our partnership with IBM combines the intelligent automation capabilities of the Now Platform with the power of Watson AIOps. We are focused on driving a generational step improvement in productivity, innovation, and growth. ServiceNow and IBM are helping customers meet the digital demands of 21st-century business.”
ServiceNow says that in today’s technology‑driven organizations, even the smallest outages can have a massive economic impact, in both lost revenue and damaged reputation. The partnership will help customers address these challenges by automating old, manual IT processes and increasing IT productivity.
Here is what IBM and ServiceNow are planning:
Joint Solution: IBM and ServiceNow will deliver a first‑of‑its‑kind joint IT solution that marries IBM Watson AIOps with ServiceNow’s intelligent workflow capabilities and market‑leading ITSM and ITOM Visibility products to help customers prevent and fix IT issues at scale. Businesses that use ServiceNow ITSM can push historical incident data into the deep machine learning algorithms of Watson AIOps to create a baseline of their normal IT environment, then identify anomalies outside of that baseline, which could take a human up to 60% longer to identify manually, according to initial results from early Watson AIOps adopters. The joint solution will position customers to enhance employee productivity, obtain greater visibility into their operational footprint and respond to incidents and issues faster.
Specific product capabilities will include:
ServiceNow ITSM allows IT to deliver scalable services on a single cloud platform estimated to increase productivity by 20%.
ServiceNow ITOM Visibility automatically delivers near real‑time visibility, from a native Configuration Management Database, into all resources and the true operational state of all business services.
IBM Watson AIOps uses AI to automate how enterprises detect, diagnose, respond to, and remediate IT anomalies in real time; a conceptual sketch of the baseline-and-anomaly approach follows this list. The solution is designed to help CIOs make more informed decisions when predicting and shaping future outcomes, focus resources on higher‑value work and build more responsive and intelligent applications that can stay up and running longer. Using Watson AIOps, the average time to resolve incidents was reduced by 65 percent, according to one recent initial proof‑of‑concept project with a client.
Services: IBM is expanding its global ServiceNow business to include additional capabilities that provide advisory, implementation, and managed services on the Now Platform. Highly‑skilled IBM practitioners will apply their expertise to facilitate rapid delivery of valuable insights and innovation to clients. IBM Services professionals also will introduce clients to intelligent workflows to help improve resiliency and reduce IT risk. ServiceNow is co‑investing in training and certification of IBM employees and dedicated staff for customer success.
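The baseline-and-anomaly idea behind the joint solution can be illustrated with any off-the-shelf anomaly detector. The sketch below uses scikit-learn’s IsolationForest on made-up incident metrics; it is a conceptual stand-in, not Watson AIOps:

```python
# Conceptual stand-in for the baseline-and-anomaly workflow (not
# Watson AIOps): fit a model of "normal" on historical incident
# metrics, then flag new observations that fall outside it.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Hypothetical history: columns are [error_rate, latency_ms, cpu_pct]
history = rng.normal(loc=[0.01, 120, 40], scale=[0.005, 15, 8],
                     size=(1000, 3))
model = IsolationForest(contamination=0.01, random_state=0).fit(history)

incoming = np.array([[0.012, 125, 42],    # looks normal
                     [0.200, 900, 97]])   # anomalous spike
print(model.predict(incoming))            # 1 = normal, -1 = anomaly
```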
“Businesses are facing increased pressures to match the digital pace of a cloud‑first market in order to meet the demands of their customers,” said Stephen Elliot, program vice president, DevOps and Management Software, IDC. “The C‑suite is transforming workflows to deliver insights and automation for more efficient customer engagement models and cost containment strategies for the business while simplifying IT operations and increasing collaboration between IT and business stakeholders.”
Microsoft has ended support for Office 2010, as well as Office 2016 for Mac, and is instead pushing users toward Microsoft 365.
Office 2010 is one of the most popular versions of the venerable office suite. In fact, as recently as 2017, a survey showed it was in use among 83% of organizations around the world.
In spite of that, Microsoft has officially ended support for Office 2010, as well as the corresponding Office 2016 for Mac. Jared Spataro, Corporate Vice President for Microsoft 365, explained the decision:
As we first announced back in April 2017, this decision aligns with our broader commitment to providing tools and experiences designed for a new world of work. If this year has taught us anything, it’s that we need to help our customers stay agile and connected despite constant change. And that means delivering cloud-connected and always up-to-date versions of our most valuable apps to every person and every organization on the planet. With Microsoft 365 Apps, we do that in three big ways. First, the cloud enables real-time collaboration across apps and within Microsoft Teams, the hub for teamwork. Second, AI and machine learning advance creativity and innovation in everything from PowerPoint design to Excel analysis. And finally, built-in, cloud-powered security protects your data and provides the peace of mind that comes with knowing your business will not only be productive, but also secured.
We understand that everyone is at a different stage of their journey to the cloud, and we’re committed to supporting our customers throughout their transition to Microsoft 365 Apps. For those customers who aren’t ready for the cloud and have a specific need for on-premises or hybrid deployment, such as fully disconnected or restricted environments, we offer Office 2019, the perpetual version of Office that does not receive feature updates. But for everyone else, we’ve created a set of resources to help you transition to the Microsoft 365 Apps and innovations designed to help keep your environment up to date once you’ve made the transition.
As more companies move to the cloud and embrace remote work, Microsoft 365 is increasingly becoming a critical option. This move will no doubt accelerate its adoption.
Jason Zander, Executive Vice President, Microsoft Azure, announces new collaborations with the telecommunications industry that will unlock the power of 5G and bring cloud and edge closer than ever:
The increasing demand for always-on connectivity, immersive experiences, secure collaboration, and remote human relationships is pushing networks to their limits, while the market is driving down prices. The network infrastructure must ensure operators are able to optimize costs and gain efficiencies, while enabling the development of personalized and differentiated services. To roll out 5G, operators will face significant challenges, including high capital expenditure (CapEx) investments; an increased need for scale and automation; and secure management of the massive volume of data 5G will generate.
Today starts a new chapter in our close collaboration with the telecommunications industry to unlock the power of 5G and bring cloud and edge closer than ever. We’re building a carrier-grade cloud and bringing more Microsoft technology to the operator’s edge. This, in combination with our developer ecosystem, will help operators to future proof their networks, drive down costs, and create new services and business models.
In Microsoft, operators get a trusted partner who will empower them to unlock the potential of 5G, enabling them to offer a range of new services such as ultra-reliable low-latency connectivity, mixed reality communications services, network slicing, and highly scalable IoT applications to transform entire industries and communities.
By harnessing the power of Microsoft Azure, on their edge, or in the cloud, operators can transition to a more flexible and scalable model, drive down infrastructure cost, use AI and machine learning (ML) to automate operations and create service differentiation. Furthermore, a hybrid and hyper-scale infrastructure will provide operators with the agility they need to rapidly innovate and experiment with new 5G services on a programmable network.
More specifically, we will further support operators as they evolve their infrastructure and operations using technologies such as software-defined networking, network function virtualization, and service-based architectures. We are bringing to market a carrier-grade platform for edge and cloud to support the operator’s goals to future proof their infrastructure with disaggregated and containerized network architectures. Recognizing that not everything will move to the public cloud, we will meet operators where they are—whether at the enterprise edge, the network edge, or in the cloud.
Our approach is built on the acquisitions of industry leaders in cloud-native network functions—Affirmed Networks and Metaswitch and on the development of Azure Edge Zones. By bringing together hundreds of engineers with deep experience in the telecommunications space, we are ensuring that our product development process is catering to the most relevant networking needs of the operators. We will leverage the strengths of Microsoft to extend and enhance the current capabilities of industry-leading products such as Affirmed’s 5G core and Metaswitch’s UC portfolio. These capabilities, combined with Microsoft’s broad developer ecosystem and deep business to business partnership programs, provide Microsoft with a unique ability to support the operators as they seek to monetize the capabilities of their networks.
Your customer, your service, powered by our technology
As we build out our partnerships with different operators, it is clear to us that there will be different approaches to technology adoption based on business needs. Some operators may choose to adopt the Azure platform and select a varied mix of virtualized or containerized network function providers. We also have operators that have requested complete end-to-end services as components for their offers. As a part of these discussions, many operators have identified points of control that are important to them, for example:
Control over where a slice, network API, or function is presented to the customer.
Definition of where and how traffic enters and exits their network.
Visibility and control over where key functions are executed for a given customer scenario.
Configuration and performance parameters of core network functions.
As we build out Azure for Operators, we recognize the importance of ensuring operators have the control and visibility they require to manage their unique industry requirements. To that end, here is how our assets come together to provide operators with the platform they need.
Interconnect
It starts with the ability to interconnect deeply with the operator’s network around the globe. We have one of the largest networks that connect with operators at more than 170 points of presence and over 20,000 peering connections around the globe, putting direct connectivity within 25 miles of 85 percent of the world’s GDP. More than 200 operators have already chosen to integrate with the Azure network through our ExpressRoute service, enabling enterprises and partners to link their corporate networks privately and securely to Azure services. We also provide additional routes to connect to the service through options as varied as satellite connectivity and TV White Space spectrum.
Edge platform
This reach helps us to supply operators with cloud computing options that meet the customer wherever those capabilities are needed: at the enterprise edge, the network edge, the network core, or in the cloud. The various form factors, optimized to support the location in which they are deployed, are supported by the Azure platform—providing virtual machine and container services with a common management framework, DevOps support, and security control.
Network functions
We believe in an open platform that leverages the strengths of our partners. Our solutions are a combination of virtualized and containerized services as composable functions, developed by us and by our Network Equipment Provider partners, to support operators’ services such as the Radio Access Network, Mobile Packet Core, Voice and Interconnect services, and other network functions.
Technology from Affirmed and Metaswitch Networks will provide Mobile Packet Core, Voice, and Interconnect services.
Cloud solutions and Azure IoT for operators
By exposing these services through the Azure platform, we can combine them with other Azure capabilities such as Azure Cognitive Services (used by more than 1 million developers processing more than 10 billion transactions per day), Azure Machine Learning, and Azure IoT, to bring the power of AI and automation to the delivery of network services. These capabilities, in concert with our partnerships with OSS and BSS providers, enable us to help operators streamline and simplify operations, create new services to monetize the network, and gain greater insights into customer behavior.
In IoT our primary focus is simplifying our solutions to accelerate what we can do together from the edge to the cloud. We’ve done so by creating a platform that provides simple and secure provisioning of applications and devices to Azure cloud solutions through Azure IoT Central, which is the fastest and easiest way to build IoT solutions at scale. IoT Central enables customers to provision an IoT app in seconds, customize it in hours, and go to production the same day. IoT Plug and Play dramatically simplifies all aspects of IoT device support and provides devices that “just work” with any solution and is the perfect complement to achieve speed and simplicity through IoT Central. Azure IoT Central also gives the Mobile Operator the opportunity to monetize more of the IoT solution and puts them in a position to be a re-seller of the IoT Central application platform through their own solutions. Learn more about using Azure IoT for operators here.
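On the device side, connecting such a solution to Azure is a short exercise with the Azure IoT device SDK for Python (pip install azure-iot-device). A minimal sketch follows; the connection string is a placeholder for the credentials a provisioned device would receive:

```python
# Device-side sketch using the Azure IoT device SDK for Python.
# The connection string below is a placeholder, not real credentials.
from azure.iot.device import IoTHubDeviceClient, Message

CONN_STR = "HostName=<hub>.azure-devices.net;DeviceId=<id>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
client.connect()
client.send_message(Message('{"temperature": 21.5}'))  # one telemetry point
client.disconnect()
```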
Cellular connectivity is increasingly important for IoT solutions and represents a vast and generational shift for mobile operators as the share of devices in market shifts towards the enterprise. We will continue our deep partnership with operators to enable fast and efficient app development and deployment, which is critical to success at the edge. This will help support scenarios such as asset tracking across industries, manufacturing and distribution of smart products, and responsive supply chains. It will also help support scenarios where things are geographically dispersed, such as smart city automation, utility monitoring, and precision agriculture.
Where we go next
Our early engagement with partners such as Telstra and Etisalat helped us shape this path. We joined the 5G Open Innovation Lab as the founding public cloud partner to accelerate enterprise startups and launch new innovations to foster new 5G use cases with even greater access to leading-edge networks. The Lab will create long-term, sustainable developer and commercial ecosystems that will accelerate the delivery of exciting new capabilities at the edge, including pervasive IoT intelligence and immersive mixed reality. And this is just the beginning. I invite you to learn more about our solutions and watch the series of videos we have curated for you.
Baidu has unveiled Quantum Leaf, a new cloud-based quantum computing platform, at its Baidu World 2020 developer conference.
Quantum computing is the next big evolution of the computing industry, promising to usher in a new era and upend entire industries. Cryptography, artificial intelligence and physics are just a few of the fields that will be impacted.
Baidu had previously announced Paddle Quantum, “a quantum machine learning development toolkit based on PaddlePaddle that can help scientists and developers quickly build and train quantum neural network models and provide advanced quantum computing applications.”
Now the company has built on that with the release of Quantum Leaf, “a new cloud-native quantum computing platform named Quantum Leaf. It is used for programming, simulating and executing quantum computers, aimed at providing the quantum programming environment for Quantum infrastructure as a Service (QaaS).”
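Behind any such platform, “simulating” a quantum computer means tracking a vector of complex amplitudes and applying gates to it. The plain-NumPy sketch below (not Paddle Quantum’s or Quantum Leaf’s API) shows the smallest possible case: a Hadamard gate putting one qubit into superposition.

```python
# Plain-NumPy illustration (not Quantum Leaf's or Paddle Quantum's
# API): a Hadamard gate puts a single qubit into an equal
# superposition of |0> and |1>.
import numpy as np

ket0 = np.array([1.0, 0.0])                    # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
state = H @ ket0

print(np.abs(state) ** 2)                      # Born rule -> [0.5 0.5]
```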
The news comes as an increasing number of companies are offering cloud-based quantum computing, one of the most recent being Xanadu.
“There are huge benefits to collaboration,” says Proofpoint CEO Gary Steele. “However, I do believe fundamentally that this work from home economy that we’re living in is going to change the face of work. You’re going to see a blend. Security leaders and organizations are going to need to figure out how do you defend people when they are sitting at home working from their couch just doing their job and doing it well?”
New AI/ML Innovations Block Bogus Emails
One of the big investments for us in this people-centric framework is to help organizations protect the data that people create. We’re giving companies more visibility and more controls to ensure that when you’re sitting in front of your couch and working from home that you’re not treating data in a way that’s going to ultimately hurt the company. For those individuals that are doing something malicious, we’re going to help companies find those malicious individuals.
We need to block (bogus emails that are supposedly from a trusted source) so that an individual doesn’t actually receive that message (in the first place). That is an impersonation. That’s how we’re applying new innovations in the AI/ML (artificial intelligence & machine learning) space to be able to identify those very sophisticated attacks and block them so that a poor user is not trying to figure out (if it is really) the CEO that asked me to do something that they shouldn’t do.
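One simple signal such systems weigh, alongside many ML-derived features, is a mismatch between a familiar display name and an unfamiliar sending domain. The toy check below only illustrates that idea and is not Proofpoint’s detection logic; the executive directory is hypothetical:

```python
# Toy impersonation check (hypothetical, not Proofpoint's logic): a
# known executive's display name arriving from the wrong domain is a
# classic signal that ML-based systems combine with many others.
EXECUTIVES = {"jane doe": "example.com"}   # hypothetical directory

def looks_like_impersonation(display_name: str, from_address: str) -> bool:
    domain = from_address.rsplit("@", 1)[-1].lower()
    expected = EXECUTIVES.get(display_name.lower().strip())
    return expected is not None and domain != expected

print(looks_like_impersonation("Jane Doe", "jane.doe@examp1e.com"))  # True
```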
Twitter CFO Ned Segal says that Twitter’s work over the last couple of years designed to improve advertiser ROI and success on the platform is starting to resonate with advertisers.
Twitter CFO Ned Segal recently discussed Twitter’s advertising initiatives, how they are dealing with suspicious accounts and how they are working to improve the health of the conversation on the platform in a wide-ranging interview on Bloomberg:
Twitter Strategy of Increasing ROI for Advertisers is Paying Off
There are a couple of things at play here and you really have to go back to the strategy we rolled out a couple years ago; to have better ad formats, to drive better relevance for advertisers, to do a better job of measuring their success on Twitter and ultimately to get a better ROI for them on the platform than they were getting before.
Also, it’s been important to articulate why they should advertise on Twitter. Twitter is the place where you launch something new, Twitter is the place where you go to advertise to the most valuable audience when they are most receptive. We were just not clear about that a couple years ago as we are today and it’s really started to resonate.
We were pretty surprised at how quickly the business has turned around in the United States. It turned faster than we thought it would this quarter and in a bigger way than we expected it to and that was a big part of what drove the business this quarter.
More Sophisticated About Suspicious Accounts
We are challenging a ton more accounts than we used to so that 9 million number is something that Jack Dorsey talked about in front of Congress back in September. We have become more sophisticated in our understanding of how people create spammy and suspicious accounts so that we can detect or prevent their creation or stop them after they’ve been created. How many get through really depends on how many should get through.
We test far more accounts that are spammy and suspicious and that helps us understand the behavior. Just because an account is created on a web browser in a certain country with a certain IP address doesn’t necessarily mean that it shouldn’t be on the platform as another one might. There’s a lot that goes into it.
Twitter Prioritizing Safety Above All Else
We said that MAU (Monthly Active Users) would decline in the mid-seven digit millions in the 4th quarter because of GDPR, our ongoing health work, and decisions we might make around SMS contracts we have with carriers. We don’t forecast MAU out further than a quarter. We’ve done it each of the last two quarters because we could see a decline coming and we wanted to share it with people.
When we step back and think about our health work more broadly we don’t want to be constrained by the constrained metrics. We want to prioritize health above all else because we know it’s a critical growth factor for the company to make sure that Twitter is a safe place for you and me and for the people who should be on the platform and that removing spammy and suspicious behavior whenever we can.
Sometimes it affects the disclosed metrics and other times, such as in the 2nd quarter when we removed tens of millions of accounts and they were largely inactive, it doesn’t affect the disclosed metrics as much.
Still Work To Do to Improve the Health of the Conversation
We still have work to do to improve the health of the conversation on Twitter. There are so many ways for us to address these challenges as people get more sophisticated in how they create bad behavior on Twitter.
One of the great things about Twitter that we are able to benefit from is because it is public, open, and real-time, we often find things but frequently things are corrected by the platform itself. Other people on Twitter who say ‘that’s not true’ or ‘you may believe that but I believe something different and I want to tell you what I believe.’ The fact that the platform is open really makes a difference and allows us to take a different approach around policies and enforcement than others make.
Twitter is Public, Open, and Real-Time
Those are things that allow people to see what a public figure is going to say regardless of their party affiliation, regardless of where they are in the world. They can learn from it, they can respond to it, and they can observe how others might respond to it. We believe that allows for a healthy public conversation, that allows people to have more information than they otherwise might.
Whether it’s something here in the United States or it’s around the Brazilian or Mexican elections which were just completed, it’s an important part of our purpose to serve a public conversation, which means people can see what other people are saying.
Health of the Conversation is Our Number One Priority
Health is our number one priority. We think about health, growing audience, improving our revenue product, and sales as our biggest priorities. I don’t expect those to change much as we move into next year and I think because Twitter is public, open, and real-time in nature we are able to leverage those characteristics and still accomplish a lot through our Twitter services team and through the machine learning that we use to amplify our policies and the Twitter services team.
We have been adding people. We are going to grow our headcount by about 15 percent this year as we continue to invest against all of our priorities. It’s not against any one priority, but it’s against all of those priorities to be able to grow the business and execute against the opportunities that we see.
Delivering a Better Twitter During an Election
We have learned a lot from past elections, whether they are in the United States or in other parts of the world, and we’ve made a bunch of changes which we feel really good about. I will give you two examples. One is where we have the Ad Transparency Center, which is a place you can go on Twitter to see who is advertising around an election, what it is that they are saying, to whom they are advertising, and how much they are paying for the impressions that they are getting. This is unprecedented transparency, and to us it’s critical to inform the public conversation.
Another thing we are doing is sometimes around elections you have people presenting themselves as somebody who they aren’t. Sometimes it’s parody, sometimes it’s very serious. The candidates will now have a stamp so that you know who they are and they really are the person they are presenting themselves to be. We see these as really important parts to delivering a better Twitter during an election.
Xanadu has released its photonic quantum computing platform and plans to double its power every six months.
Quantum computing is ‘the next big thing’ in computing, promising to usher in an all-new era. Quantum computing will fundamentally change multiple industries, including artificial intelligence, machine learning, cryptography and more.
Multiple companies are now making quantum computing available to customers, but Xanadu’s approach differs from that of some competitors. Instead of requiring quantum computers to be cooled to temperatures below those of deep space, Xanadu’s photonic quantum processors can run at room temperature.
“We believe that photonics offers the most viable approach towards universal fault-tolerant quantum computing with Xanadu’s ability to network a large number of quantum processors together. We are excited to provide this ecosystem, a world-first for both quantum and classical photonics,” said Christian Weedbrook, Xanadu Founder and CEO. “Our architecture is new, designed to scale-up like the Internet versus traditional mainframe-like approaches to quantum computing.”
Unlike traditional computing, which revolves around binary bits with a value of either 0 or 1, quantum computing revolves around qubits. Rather than being confined to one value, qubits can exist in a superposition of both states simultaneously. The more qubits a quantum computer has, the more powerful it is. Xanadu believes it can double the power of its processors every six months.
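The scaling claim is easy to see numerically: an n-qubit state vector holds 2^n complex amplitudes, so each added qubit doubles the state space a classical machine would need to track.

```python
# An n-qubit state vector has 2**n complex amplitudes, so every added
# qubit doubles the state space a classical simulator must track.
for n in (1, 2, 8, 24):
    print(f"{n:2d} qubits -> {2**n:>12,} amplitudes")
```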
“We believe we can roughly double the number of qubits in our cloud systems every six months,” said Weedbrook. “Future machines will also offer improved performance and new features like increased qubit connectivity, unlocking more applications for customers.”
An AI-powered “pilot” went undefeated in five rounds of simulated dogfighting with a top Air Force pilot.
Fans of the Terminator franchise are familiar with Skynet, the artificial intelligence that turned on humanity, nearly wiping it out and sending the Terminators to eliminate human targets. Critics of AI have long claimed that it represents one of the greatest existential threats humanity has ever faced.
The latest development is not likely to assuage any concerns, as an AI performed flawlessly in simulated combat against a top Air Force pilot.
“The event was the culmination of an effort that the Defense Advanced Research Projects Agency (DARPA) began last year as an adjacent project to the larger Air Combat Evolution (ACE) program, which is focused on exploring how artificial intelligence and machine learning may help automate various aspects of air-to-air combat,” writes The Drive’s Joseph Trevithick.
Eight companies originally took part in the competition. After their AIs flew against each other, Heron Systems came out on top and advanced to the final round, where its AI beat the human pilot.
Should AI ever become a Skynet-like threat, it appears it won’t have any problem controlling the sky.
Cense AI has inadvertently leaked 2.5 million detailed medical records of auto accident victims.
Cense AI is an “SaaS platform that helps business in implementing Intelligent Process Automation, intelligent bots to automate tasks without disrupting current system.” The company specializes in “simplifying implementation of business process automation using Machine Learning and Natural Language Processing.”
According to security researcher Jeremiah Fowler, working in collaboration with Secure Thoughts, Cense AI left two folders with medical data exposed on the same IP address as the company’s website. The two folders contained a combined “2.5 million records that appeared to contain sensitive medical data and PII (Personally Identifiable Information). The records included names, insurance records, medical diagnosis notes, and much more.” In addition, there were clinics, insurance providers and accounts contained in the data.
This is a massive breach on the part of a company trusted with the most sensitive type of customer information, and serves as a cautionary example of what can happen when outside companies are given access to medical data.
What’s more, to date, there has not been any public statement, blog post or explanation on Cense’s part. In other words, this appears to be another case study in how not to handle a data breach.
Google has announced it is investing $450 million in security company ADT, in a multi-year partnership that will give Google a 6.6% stake.
The deal is a win for both companies. Google benefits from ADT’s security expertise, not to mention its 20,000 professionals, who will soon be selling and installing Nest devices and services. ADT, on the other hand, benefits from Google’s AI-driven smart home developments.
“Over time, Nest’s devices, powered by Google’s machine learning capabilities will enhance ADT’s security monitoring and become the cornerstone of ADT’s smart home offering,” writes Rishi Chandra, Vice President and GM, Nest. “The goal is to give customers fewer false alarms, more ways to receive alarm events, and better detection of potential incidents inside and around the home. It will also provide people with more helpful notifications that make everyday life more convenient, like package detection. ADT customers will also have access to Nest Aware, a service that keeps people informed about important events at home, including intelligent alerts and event history recording for up to 30 days.”
Google has repeatedly been in the news lately, with its recent Fitbit deal under intense scrutiny in the US and the EU. Regulators are concerned with how Google will use the data it acquires from the wearables maker. It’s possible this scrutiny was a motivating factor in Google investing in ADT, rather than attempting to buy it or a competing firm outright. Whatever the motivation, it’s evident Google has high hopes for what the partnership will bring.
“Together, we aim to create the next generation of the helpful home—based on new security solutions that will better protect and connect people to their homes and families,” writes Chandra.