Recent Tests Demonstrate 30 to 250 Percent Higher HPC Application Performance and up to 50 Percent Savings on Capital and Operating Expenses vs. Proprietary Interconnect Offerings

Mellanox® Technologies, Ltd. (NASDAQ: MLNX), a leading supplier of high-performance, end-to-end interconnect solutions for data center servers and storage systems, today announced that EDR 100Gb/s InfiniBand solutions have demonstrated 30 to 250 percent higher HPC application performance versus Omni-Path. These performance tests were conducted at end-user installations and the Mellanox benchmarking and research center, and covered a variety of HPC application segments including automotive, climate research, chemistry, bioscience, genomics and more.

Examples of extensively used mainstream HPC applications:

  • GROMACS is a molecular dynamics package designed for simulations of proteins, lipids and nucleic acids, and is one of the fastest and most broadly used applications for chemical simulations. GROMACS has demonstrated a 140 percent performance advantage on an InfiniBand-enabled 64-node cluster.
  • NAMD, highly noted for its parallel efficiency, is used to simulate large biomolecular systems and plays an important role in modern molecular biology. Using InfiniBand, the NAMD application has demonstrated a 250 percent performance advantage on a 128-node cluster.
  • LS-DYNA is an advanced multi-physics simulation software package used across automotive, aerospace, manufacturing and bioengineering industries. Using InfiniBand interconnect, the LS-DYNA application has demonstrated a 110 percent performance advantage running on a 32-node cluster.

Due to its scalability and offload technology advantages, InfiniBand has demonstrated higher performance while utilizing just 50 percent of the data center infrastructure, thereby enabling the industry’s lowest Total Cost of Ownership (TCO) for these applications and HPC segments. For the GROMACS application, a 64-node InfiniBand cluster delivers 33 percent higher performance than a 128-node Omni-Path cluster; for the NAMD application, a 32-node InfiniBand cluster delivers 55 percent higher performance than a 64-node Omni-Path cluster; and for the LS-DYNA application, a 16-node InfiniBand cluster delivers 75 percent higher performance than a 32-node Omni-Path cluster.
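The half-size-cluster comparisons above imply a per-node efficiency ratio that can be derived with simple arithmetic. The sketch below uses only the node counts and percentage advantages quoted in this release; the per-node figure assumes the cluster-level comparison translates directly (no published raw benchmark scores are used):

```python
# Back-of-the-envelope per-node efficiency from the cluster-level
# figures quoted in this release. Only the node counts and ratios
# below come from the text; per-node scaling is an assumption.
cases = {
    # app: (InfiniBand nodes, Omni-Path nodes, IB advantage at those sizes)
    "GROMACS": (64, 128, 1.33),   # 64-node IB beats 128-node OPA by 33%
    "NAMD":    (32, 64, 1.55),    # 32-node IB beats 64-node OPA by 55%
    "LS-DYNA": (16, 32, 1.75),    # 16-node IB beats 32-node OPA by 75%
}

for app, (ib_nodes, opa_nodes, ratio) in cases.items():
    # per-node advantage = cluster-level ratio * node-count ratio
    per_node = ratio * (opa_nodes / ib_nodes)
    print(f"{app}: {per_node:.2f}x performance per node")
```

Because each InfiniBand cluster is half the size of its Omni-Path counterpart, the quoted 33/55/75 percent advantages correspond to roughly 2.7x, 3.1x and 3.5x performance per node, which is the sense in which the release claims comparable or better throughput with 50 percent of the infrastructure.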

“InfiniBand solutions enable users to maximize their data center performance and efficiency versus proprietary competitive products. EDR InfiniBand enables users to achieve 2.5X higher performance while reducing their capital and operational costs by 50 percent,” said Gilad Shainer, vice president of marketing at Mellanox Technologies. “As a standard and intelligent interconnect, InfiniBand guarantees both backward and forward compatibility, and delivers optimized data center performance to users for any compute elements – whether they include CPUs by Intel, IBM, AMD or ARM, or GPUs or FPGAs. Utilizing the InfiniBand interconnect, companies can gain a competitive advantage, reducing their product design time while saving on their needed data center infrastructure.”

The application testing was conducted at end-user data centers and the Mellanox benchmarking and research center. The full report will be available on the Mellanox website. For more information, please contact Mellanox Technologies.

Supporting Resources:

  • Learn more about: Mellanox HPC Solutions
  • Learn more about: Mellanox Center of Excellence program
  • Follow Mellanox on Twitter, Facebook, Google+, LinkedIn, and YouTube
  • Join the Mellanox Community

About Mellanox

Mellanox Technologies (NASDAQ: MLNX) is a leading supplier of end-to-end Ethernet and InfiniBand intelligent interconnect solutions and services for servers, storage, and hyper-converged infrastructure. Mellanox intelligent interconnect solutions increase data center efficiency by providing the highest throughput and lowest latency, delivering data faster to applications and unlocking system performance. Mellanox offers a choice of high-performance solutions, including network and multicore processors, network adapters, switches, cables, software and silicon, that accelerate application runtime and maximize business results for a wide range of markets including high performance computing, enterprise data centers, Web 2.0, cloud, storage, network security, telecom and financial services. More information is available at: www.mellanox.com.

Disclaimer:

Testing was done using what is believed to be the most current software and hardware configuration available in the test infrastructures. Results may vary depending upon the particular testing environment.

Note: Mellanox and ConnectX are registered trademarks of Mellanox Technologies, Ltd. All other trademarks are property of their respective owners.

Mellanox Technologies, Ltd.
Press/Media Contact
McGrath/Power Public Relations and Communications
Allyson Scott, +1-408-727-0351
allysonscott@mcgrathpower.com
or
Israel PR Contact
Galai Communications Public Relations
Jonathan Wolf, +972 (0) 3-613-52-48
yoni@galaipr.com
