April 27, 2021 By Levi Norman 3 min read

As more companies turn to hybrid clouds to fuel their digital transformation, the need to ensure data accessibility across the enterprise—from the data center to the edge—grows.

Often geographically dispersed and disconnected from the data center, edge computing can strand vast amounts of data that could otherwise be brought to bear on analytics and AI. According to a recent report from IDC, the number of new operational processes deployed on edge infrastructure will grow from less than 20% to over 90% by 2024[1] as digital engineering accelerates IT/OT convergence.

IBM is taking aim at this challenge with several innovative storage products. Announced today, IBM Spectrum® Fusion is a container-native software-defined storage (SDS) solution that fuses IBM’s trusted general parallel file system technology (IBM Spectrum® Scale) and its leading data protection software (IBM Spectrum® Protect Plus). This integrated product simplifies data access and availability from the data center to the edge of the network and across public cloud environments.

In addition, we announced the new IBM Elastic Storage® System 3200, an all-flash controller storage system, equipped with IBM Spectrum Scale. The new 2U model offers 100% more performance than its predecessor and up to 367TB of capacity per node[2].

We are committed to helping customers propel their transformations by providing solutions that make it easier to discover, access and manage data across their increasingly complex hybrid cloud environments. Today’s announcements are testaments to this strategy.

IBM Spectrum Fusion

IBM Spectrum Fusion is a hybrid cloud container-native data solution for Red Hat® OpenShift® and Red Hat OpenShift Data Foundation (formerly known as Red Hat OpenShift Container Storage). It “fuses” the storage platform with storage services and is built on the market-leading technology of IBM Spectrum Scale with advanced file management and global data access.
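
To make the "container-native" idea concrete, here is a minimal sketch of how an application team on Red Hat OpenShift might request shared file storage from such a layer through a standard Kubernetes PersistentVolumeClaim. The storage class and namespace names are illustrative assumptions, not names documented for IBM Spectrum Fusion.

```python
# Hypothetical sketch: requesting container-native storage on OpenShift via a
# PersistentVolumeClaim, using the official Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running in a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="analytics-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteMany"],           # shared access across pods and nodes
        storage_class_name="spectrum-fusion-fs",  # hypothetical class name for illustration
        resources=client.V1ResourceRequirements(
            requests={"storage": "100Gi"}
        ),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="edge-analytics",  # hypothetical project/namespace
    body=pvc,
)
```

Once bound, any pod in the project can mount the claim like ordinary Kubernetes storage, which is the point of a container-native design: storage is provisioned and consumed through the same declarative APIs as the applications themselves.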

IBM Spectrum Fusion will be offered in two iterations: a hyperconverged infrastructure (HCI) system, due in the second half of 2021, and an SDS software solution, due in 2022.

The HCI edition will be the industry’s first container-centric hyperconverged system. Although competitive HCI systems support containers, most are VM-centric. IBM Spectrum Fusion, by contrast, will be built for and with containers out of the box, running on Red Hat OpenShift. Characteristics of IBM Spectrum Fusion HCI include:

  • Integrated HCI appliance for both containers and VMs using Red Hat OpenShift
  • Global data access with active file management (AFM)
  • Data resilience for local and remote backup and recovery
  • Simple installation and maintenance of hardware and software
  • Global data platform stretching from public clouds to on-premises or edge locations
  • IBM Cloud® Satellite and Red Hat Advanced Cluster Management (ACM) integration
  • Starts small with 6 servers and scales up to 20 (with NVIDIA GPU-enhanced options for HPC)

“IBM Spectrum Fusion HCI will provide our customers with a powerful container-native storage foundation and enterprise-class data storage services for hybrid cloud and container deployments,” said Bob Elliott, Vice President Storage Sales, Mainline Information Systems. “In today’s world, our customers want to leverage their data from edge to core data center to cloud and with IBM Spectrum Fusion HCI our customers will be able to do this seamlessly and easily.”

In 2022, IBM Spectrum Fusion will also be available as a stand-alone software-defined storage solution.[3]

Next-generation storage built for high-performance AI and hybrid cloud

IBM is also introducing the highest-performing scale-out file system node ever released for IBM Spectrum Scale. The new Elastic Storage System (ESS) 3200, which includes advanced file management and global data access, is a two-rack-unit (2U) enclosure with all-NVMe flash that delivers 80GB/s of throughput, 100% faster than the previous ESS 3000 model.[4]

“IBM’s newest member for enterprise-class storage offerings, the ESS 3200 with IBM Spectrum Scale, provides a faster, reliable data platform for HPC, AI/ML workloads enabling my clients to expedite time to results.” – John Zawistowski, Global Systems Solution Executive, Sycomp

This solution is designed to be easy to deploy; it can start at a 48TB configuration and scale up to 8YB (yottabytes) of global capacity in a single global namespace that seamlessly spans edge, core data center and hybrid cloud environments. With options for 100Gbps Ethernet or 200Gbps InfiniBand, this system is designed for the most demanding high-performance enterprise, analytics, big data and AI workloads.
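
As a rough illustration of what a single global namespace means for applications, the sketch below assumes a shared filesystem mounted at the same (hypothetical) path at an edge site and in the core data center. Code at the edge writes results into the namespace and code in the core reads the identical path, with no explicit copy step in the application itself.

```python
from pathlib import Path

# Hypothetical mount point for the shared filesystem; the actual path
# depends on how a given cluster is configured.
DATASET = Path("/gpfs/global/edge-site-07/sensor-batch-0427.parquet")

def write_at_edge(records: bytes) -> None:
    # Runs at the edge location: lands data directly in the global namespace.
    DATASET.write_bytes(records)

def analyze_in_core() -> int:
    # Runs in the core data center: reads the same path; any data movement or
    # caching is handled by the storage layer, not by application code.
    return len(DATASET.read_bytes())
```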

Read the press release here.

For more information on IBM Spectrum Fusion and IBM Spectrum Fusion HCI, please visit our webpage.

For more information on IBM ESS 3200 please visit: https://www.ibm.com/products/elastic-storage-system.

And learn more about today’s additional announcements for IBM’s Data Resilience Portfolio here.
