Gyrfalcon Technology, a global developer and provider of high-performance, low-power AI accelerators packaged in low-cost, small-sized chips, has announced a new system offering for upgrading data centers to deliver “very high-performance inferencing, but using extremely low energy.”
The newly launched ‘Lacelli AI Server’ is an AI Acceleration Subsystem for Edge Inferencing, designed to bring “extremely high performing” artificial intelligence (AI) to colocation and enterprise data center operations.
The server backplane includes five Arm-based intelligent switch processors and onboard DDR memory within a 2U, 19-inch rack-mountable chassis cooled by twin fans. These processors provide efficient peer-to-peer links between the AI Modules within the chassis and coordinate communication between the server and external clusters of servers in the same data center.
The server chassis can accommodate up to 32 AI Modules, each containing four Lightspeeur 2803 accelerator chips mounted on PCIe Gen2 x4 cards, paired with an Arm-based core, and occupying its own dedicated slot and port. Each AI Module delivers 1,200 FPS while drawing only 5 Watts of power, a performance-per-watt ratio of 240 FPS/W. Fully loaded, the server delivers an astounding 38,400 FPS with only 1.8 ms of latency.
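As a quick check, the headline numbers follow directly from the per-module specs. The short Python sketch below reproduces that arithmetic using only the figures quoted above; the total module power it prints is derived from those figures and excludes the backplane, switch processors and fans.

```python
# Sanity check of the Lacelli server's published performance figures,
# using only the per-module specs quoted in the announcement.

MODULES_PER_CHASSIS = 32   # maximum AI Modules in the 2U chassis
FPS_PER_MODULE = 1_200     # frames per second per AI Module
WATTS_PER_MODULE = 5       # power draw per AI Module

fps_per_watt = FPS_PER_MODULE / WATTS_PER_MODULE        # 240.0 FPS/W
total_fps = MODULES_PER_CHASSIS * FPS_PER_MODULE        # 38,400 FPS
module_power = MODULES_PER_CHASSIS * WATTS_PER_MODULE   # 160 W (modules only, derived)

print(f"Per-module efficiency: {fps_per_watt:.0f} FPS/W")
print(f"Fully loaded throughput: {total_fps:,} FPS")
print(f"Module power budget (modules only): {module_power} W")
```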
“With this new server product, we truly open the door for many new kinds of customers,” said Dr. Yasuo Nishiguchi, Chairman and Chief Executive Officer at Gyrfalcon Technology Japan. “Collaborating within our ecosystem we can achieve new levels of performance, and continue to deliver with low energy use for many types of datacenter opportunities.”
Edge Inferencing AI Acceleration
Whether used in the data center of a large colocation services provider, or on campus at a medium or large enterprise, the Lacelli Edge Inferencing Server can incorporate audio, visual or sensor inputs to execute AI for a diverse range of applications:
- Object detection, classification & identification
- Visual analysis
- Natural language processing
- Voice command, recognition & authentication
- Face detection & recognition
- Image style transfer & super resolution
- Image & video search, encoding, captioning, segmentation & enhancement
The Lacelli Server is an ideal fit for a variety of use cases:
- Machine vision
- Unmanned stores & warehouses
- Smart cities & buildings
- Business intelligence
- Artificial Intelligence of Things (AIoT)
The Lacelli server was developed by GTI Japan and is already being tested by tier-one operators and service providers. GTI will introduce the server to customers worldwide in 2020 and demonstrate it live at CES 2020, January 7-9, 2020. Interested parties can request a private showing at the company booth.