As demand grows for more powerful and efficient microelectronics systems, industry is turning to 3D integration: stacking chips on top of each other. This vertically layered architecture could allow high-performance processors, like those used for artificial intelligence, to be packaged closely with other highly specialized chips for communication or imaging.
Combinatorial optimization problems (COPs) arise in various fields such as shift scheduling, traffic routing, and drug development. However, they are challenging to solve using traditional computers in a practical timeframe.
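To see why such problems strain conventional computers, consider the classic travelling-salesman problem as a stand-in (the article itself does not name a specific example): the number of possible tours grows factorially with the number of cities, so exhaustive search becomes impractical very quickly. A minimal Python sketch with made-up coordinates:

    from itertools import permutations
    from math import dist, factorial

    # Toy city coordinates; the values are invented purely for illustration.
    cities = [(0, 0), (2, 1), (5, 3), (1, 4), (6, 0), (3, 5)]

    def tour_length(order):
        """Total length of a closed tour visiting the cities in the given order."""
        return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
                   for i in range(len(order)))

    # Brute force: evaluate every permutation of the six cities (720 tours).
    best = min(permutations(range(len(cities))), key=tour_length)
    print("best tour:", best, "length:", round(tour_length(best), 2))

    # The search space explodes factorially as the instance grows.
    for n in (6, 10, 20):
        print(f"{n} cities -> {factorial(n):,} possible tours")

Even at 20 cities the brute-force search space already exceeds 10^18 tours, which is the kind of growth that motivates heuristics and special-purpose hardware for COPs.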
What happens when trailblazing engineers and industry professionals team up? The answer may transform the future of computing efficiency for modern data centers.
As artificial intelligence (AI) continues to advance, researchers at POSTECH (Pohang University of Science and Technology) have identified a breakthrough that could make AI technologies faster and more efficient.
In a new Nature Communications study, researchers have developed an in-memory ferroelectric differentiator capable of performing calculations directly in memory, without requiring a separate processor.
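For readers unfamiliar with the term, a differentiator outputs how quickly its input changes over time. The Python sketch below is only a software analogy with invented toy frames, not the device's in-memory physics: the same operation, applied to consecutive video frames, cancels the static background and keeps only what moved.

    import numpy as np

    # Two synthetic 4x4 "frames": a single bright pixel moves one column between them.
    frame_t0 = np.zeros((4, 4)); frame_t0[1, 1] = 1.0
    frame_t1 = np.zeros((4, 4)); frame_t1[1, 2] = 1.0

    # A temporal derivative approximated by a frame difference: unchanged pixels
    # cancel to zero, so only the moving feature survives.
    diff = frame_t1 - frame_t0
    print(diff)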
A research team has introduced Capsule, a new out-of-core mechanism for large-scale graph neural network (GNN) training, which achieves up to a 12.02× improvement in runtime efficiency while using only 22.24% of the main memory, compared with state-of-the-art out-of-core GNN systems. This work was published in the Proceedings of the ACM on...
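As background on the term "out-of-core" (this is a generic illustration, not Capsule's actual design): data that does not fit in main memory stays on disk and is streamed through RAM in bounded chunks, so peak memory use is independent of dataset size. A minimal Python sketch:

    import os
    import tempfile
    import numpy as np

    # Write a feature matrix to disk; the sizes are arbitrary and purely illustrative.
    path = os.path.join(tempfile.gettempdir(), "features.npy")
    np.save(path, np.random.rand(100_000, 16).astype(np.float32))

    data = np.load(path, mmap_mode="r")   # memory-mapped: nothing is loaded yet
    chunk = 10_000
    col_sums = np.zeros(16, dtype=np.float64)
    for start in range(0, data.shape[0], chunk):
        block = np.asarray(data[start:start + chunk])  # only this chunk enters RAM
        col_sums += block.sum(axis=0)
    print("column means:", col_sums / data.shape[0])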
Space is a highly volatile environment. Factors like radiation, extreme temperatures, and debris make outer space challenging for operating technology. In particular, radiation can have devastating effects on computer chips.
Penn Engineers have developed the first programmable chip that can train nonlinear neural networks using light, a breakthrough that could dramatically speed up AI training, reduce energy use, and even pave the way for fully light-powered computers.
The emergence of AI has profoundly transformed numerous industries. Driven by deep learning technology and big data, AI requires significant processing power to train its models. While the existing AI infrastructure relies on graphics processing units (GPUs), the substantial processing demands and energy expenses associated with its operation remain key...
Computer chips that combine the use of light and electricity are shown to increase computational performance, while reducing energy consumption, compared with conventional electronic chips. The photonic computing chips, described in two papers in Nature this week, might address the growing computing demands driven by advancing artificial intelligence technology.
A new way to temporarily store data, Gigaflow, helps direct the heavy traffic that AI and machine learning workloads create in cloud data centers, according to a study led by University of Michigan researchers.
A smaller, lighter and more energy-efficient computer, demonstrated at the University of Michigan, could help save weight and power for autonomous drones and rovers, with implications for autonomous vehicles more broadly.
More than seven years ago, cybersecurity researchers were thoroughly rattled by the discovery of Meltdown and Spectre, two major security vulnerabilities uncovered in the microprocessors found in virtually every computer on the planet.
Korean researchers have developed a digital holography processor that converts two-dimensional (2D) videos into three-dimensional (3D) holograms in real time. Because it can transform ordinary 2D video instantaneously, the technology is expected to play a key role in the future of holography.
A team of researchers from Cornell Tech has developed a new tool designed to revolutionize hardware troubleshooting with the help of 3D phone scans.