Mellanox MHQH29-XTC ConnectX IB Infiniband Host Bus Adapter

MPN: MHQH29-XTC
Out of Stock
Contact sales for pricing
Highlights

ConnectX adapter cards provide the highest-performing and most flexible interconnect solution for Enterprise Data Centers, High-Performance Computing, and Embedded environments. Clustered databases, parallelized applications, transactional services, and high-performance embedded I/O applications achieve significant performance improvements, resulting in reduced completion time and lower cost per operation.

ConnectX-based adapter cards simplify network deployment by consolidating cables and enhancing performance in virtualized server environments.

World-Class Performance Over InfiniBand

ConnectX delivers low latency and high bandwidth for performance-driven server and storage clustering applications. These applications benefit from the reliable transport connections and advanced multicast support offered by ConnectX. Network protocol processing and data movement overhead, such as InfiniBand RDMA and Send/Receive semantics, are handled in the adapter without CPU intervention. Servers supporting PCI Express 2.0 at 5 GT/s can take full advantage of 40Gb/s InfiniBand, balancing the I/O requirements of these high-end servers.
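On a Linux host, the adapter and its negotiated link rate can be checked through the standard RDMA sysfs tree. The sketch below is a minimal, hedged example: it assumes only the stock `/sys/class/infiniband` layout exposed by the kernel's InfiniBand drivers, and simply returns an empty result on machines without RDMA hardware.

```python
import os

def list_ib_devices(sysfs_root="/sys/class/infiniband"):
    """Return {device: {port: rate}} for RDMA devices the kernel has registered.

    On a host with a QDR ConnectX card, a port's "rate" attribute typically
    reads something like "40 Gb/sec (4X QDR)". Returns an empty dict on
    hosts without InfiniBand hardware or drivers.
    """
    devices = {}
    if not os.path.isdir(sysfs_root):
        return devices
    for dev in sorted(os.listdir(sysfs_root)):
        ports = {}
        ports_dir = os.path.join(sysfs_root, dev, "ports")
        if os.path.isdir(ports_dir):
            for port in sorted(os.listdir(ports_dir)):
                try:
                    with open(os.path.join(ports_dir, port, "rate")) as f:
                        ports[port] = f.read().strip()
                except OSError:
                    ports[port] = "unknown"
        devices[dev] = ports
    return devices

if __name__ == "__main__":
    for dev, ports in list_ib_devices().items():
        for port, rate in ports.items():
            print(f"{dev} port {port}: {rate}")
```

The same information is available from the `ibv_devinfo` and `ibstat` utilities shipped with the OFED/rdma-core software stack.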

TCP/UDP/IP Acceleration

Applications utilizing TCP/UDP/IP transport can achieve industry-leading throughput over InfiniBand. The hardware-based stateless offload engines in ConnectX reduce the CPU overhead of IP packet transport, freeing more processor cycles to work on the application.
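A key point of TCP/UDP/IP over InfiniBand (IPoIB) is that applications need no changes: they use ordinary sockets, and the offload engines work underneath. The sketch below is a hedged illustration using loopback; on a real cluster the only difference would be binding to an address assigned to the IPoIB interface (e.g. a hypothetical `ib0`).

```python
import socket
import threading

def run_echo_demo(payload=b"hello over IPoIB"):
    """One TCP round trip with standard sockets.

    Over IPoIB the identical code runs on the InfiniBand fabric; only the
    bind address (here loopback) would point at the IPoIB interface.
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))   # on a cluster: an address on the ib0 interface
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(4096))  # echo one message back

    t = threading.Thread(target=serve)
    t.start()

    cli = socket.create_connection(("127.0.0.1", port))
    with cli:
        cli.sendall(payload)
        reply = cli.recv(4096)
    t.join()
    srv.close()
    return reply

if __name__ == "__main__":
    print(run_echo_demo())
```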

I/O Virtualization

ConnectX support for hardware-based I/O virtualization provides dedicated adapter resources and guaranteed isolation and protection for virtual machines (VMs) within the server. I/O virtualization with ConnectX gives data center managers better server utilization and LAN and SAN unification while reducing cost, power, and cabling complexity.
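On Linux, hardware I/O virtualization of this kind is typically exposed through SR-IOV, and a device's virtual-function (VF) capacity can be queried from sysfs. A minimal sketch, assuming only the standard `sriov_totalvfs`/`sriov_numvfs` sysfs attributes; `ib0` is a hypothetical interface name, and the function returns `None` when the interface or capability is absent.

```python
import os

def sriov_vf_limits(ifname):
    """Read SR-IOV virtual-function counts for a network interface.

    Returns (total_vfs, current_vfs) from the standard Linux sysfs
    attributes, or None if the interface does not exist or its device
    is not SR-IOV capable.
    """
    base = f"/sys/class/net/{ifname}/device"
    try:
        with open(os.path.join(base, "sriov_totalvfs")) as f:
            total = int(f.read())
        with open(os.path.join(base, "sriov_numvfs")) as f:
            current = int(f.read())
    except OSError:
        return None
    return (total, current)

if __name__ == "__main__":
    print(sriov_vf_limits("ib0"))  # hypothetical IPoIB interface name
```

Enabling VFs (writing to `sriov_numvfs`) requires root privileges and driver support, so the sketch stays read-only.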

Storage Accelerated

A consolidated compute and storage network achieves significant cost-performance advantages over multi-fabric networks. Standard block and file access protocols leveraging InfiniBand RDMA result in high-performance storage access. Fibre Channel frame encapsulation (FCoIB or FCoE) and hardware offloads enable simple connectivity to Fibre Channel SANs.

Software Support

All Mellanox adapter cards are compatible with TCP/IP and OpenFabrics-based RDMA protocols and software. They are also compatible with InfiniBand and cluster management software available from OEMs. The adapter cards are compatible with major operating system distributions.