SAN FRANCISCO — Today’s datacenters just aren’t good enough to meet the demands of the billions of connected devices and critical workloads piling up.
That has been the key theme, at least on an enterprise level, at the annual Intel Developer Forum.
Much like a few other big tech players in the last month, namely VMware, Intel has pinpointed software-defined networking as the answer.
Speaking at a keynote session on Wednesday morning, Intel’s datacenter group general manager Diane Bryant reflected on the “tremendous change occurring in our industry and the massive opportunity it presents for all of us.”
“That growth in devices is driven by what people can do with the devices,” Bryant posited, explaining this trajectory is fueled by what Intel has been referring to constantly this week as the digital services economy.
But to power this economy, immediate changes to the way datacenters are designed are critical.
“Today’s infrastructure is just not capable of handling this explosive growth opportunity,” Bryant lamented. She outlined that these changes mean the datacenter will have to move apps from static to dynamic, manual to automatic, and siloed to open, common architectures.
Quite simply, Bryant summed up, datacenters will be tuned to the workload.
F5 Networks chief technical officer Karl Triebes elaborated briefly on the company’s partnership with Intel, boasting that F5 can take its software and consolidate functionality within the datacenter, which decreases the number of boxes while leveraging new capabilities.
Following up on a prominent addition to the Intel Xeon family this week, Bryant highlighted that Intel now offers 100 datacenter processors on top of new, customized architectures for ecosystem partners and end users.
With the debut of the third generation of the Xeon D, a 14nm energy-efficient and dense SoC, Intel has taken the next step in the ongoing datacenter platform roadmap it laid out back in 2011.
Customized is the key word in all of this. Intel executives outlined the layers of options customers have for improving workload acceleration, starting at the software level (which is more flexible than hardware), followed by accelerators, SoC and customer IP, and the instruction set architecture.
The processor giant is also redirecting its investments toward software-defined storage, working hand-in-hand with the open source community on OpenStack and the free software storage platform Ceph.
“This is a prime example of where we can collectively innovate and unleash the demand,” Bryant remarked. Pointing to the telecommunications industry as an example, Bryant explained how carriers can move from proprietary, fixed-function networks onto software-defined networks. She touted the results as “compelling,” not only for operational and capital expenditure savings but also for accelerating the pace of delivering new services.
“Most of us who know this industry would have never guessed it could have moved so quickly,” Bryant reflected.
Intel has already enlisted more than 85 partners for its Network Builders Program to encourage adoption of SDN, including Oracle, HP, Citrix, Dell, and many more.
Tying these initiatives together with a number of other cost-saving objectives (i.e., silicon photonics and NFV), Intel principal engineer Das Kahmhout demonstrated how the collective orchestration layer is open to developers for optimization while also fueling better business decisions.
Kahmhout compared the datacenter orchestration layer to the human brain, in that the workloads and data pumped through all of these levels result in a flow of automatic, intelligent analytics.
Bryant admitted big data is certainly an industry buzz term, but she argued that the buzz is also valid. Intel’s belief is that within the near future, data analytics will be both an explicit and implicit attribute of all services.
To meet evolving big data analytics platform requirements, Intel’s approach is to ensure the software platform is tightly coupled and tied to the hardware. Here, Bryant highlighted Intel’s commitment to the Apache Hadoop project and collaboration with Cloudera.
“The entire ecosystem could not have happened if it weren’t an open source project,” insisted Cloudera co-founder and chief strategy officer Mike Olson.
He suggested to the developers in the audience that the internet of things could be thought of as “the Internet of APIs,” meaning more and more of these environments (in the cloud and on-premises) are programmable.
Olson predicted that the opportunity to design new software in this manner “is enormous, and it’s only going to get bigger.”
Source: Associated Press