Published How to build greener data centers? Scientists say crank up the heat



Colder is not always better for energy-hungry data centers, especially when it comes to their power bills. A new analysis finds that keeping the centers at 41°C, or about 106°F, could save up to 56% in cooling costs worldwide. The study proposes new temperature guidelines that could help in developing and managing more efficient data centers and IT servers in the future.
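For reference, the temperature figure is a standard Celsius-to-Fahrenheit conversion:

$$T_{\mathrm{F}} = \tfrac{9}{5}\,T_{\mathrm{C}} + 32 = 1.8 \times 41 + 32 = 105.8\,^{\circ}\mathrm{F} \approx 106\,^{\circ}\mathrm{F}$$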
Published Physicists demonstrate powerful physics phenomenon



In a new breakthrough, researchers have used a novel technique to confirm a previously undetected physics phenomenon that could be used to improve data storage in the next generation of computer devices.
Published Self-correcting quantum computers within reach?



Quantum computers promise to reach speeds and efficiencies impossible for even the fastest supercomputers of today. Yet the technology hasn't seen much scale-up and commercialization, largely because of its inability to self-correct. Unlike classical computers, quantum computers cannot correct errors by copying encoded data over and over; the no-cloning theorem forbids duplicating unknown quantum states, so scientists had to find another way. Now, a new paper illustrates a quantum computing platform's potential to solve the longstanding problem known as quantum error correction.
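To see the classical copy-based strategy that quantum hardware cannot borrow, here is a minimal, purely illustrative sketch of a three-bit repetition code with majority-vote decoding (plain Python, not the platform described in the paper):

```python
# Classical error correction by copying: encode each bit three times,
# then decode by majority vote. The no-cloning theorem rules out this
# copy-based strategy for unknown quantum states, which is why quantum
# error correction needs a different approach.
from collections import Counter

def encode(bits):
    """Repeat every data bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(codeword):
    """Recover each data bit by majority vote over its three copies."""
    return [Counter(codeword[i:i + 3]).most_common(1)[0][0]
            for i in range(0, len(codeword), 3)]

data = [1, 0, 1]
noisy = encode(data)
noisy[1] ^= 1                   # flip one copy of the first bit
assert decode(noisy) == data    # a single flip per block is corrected
```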
Published Exploring parameter shift for quantum Fisher information



Scientists have developed a technique for quantum computing and quantum machine learning called 'Time-dependent Stochastic Parameter Shift.' The method improves the estimation of gradients, or derivatives of functions, a crucial step in many computational tasks.
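The paper's time-dependent stochastic variant is not reproduced here, but the standard parameter-shift rule it builds on fits in a few lines. For a single-qubit rotation RY(θ) applied to |0⟩, the expectation value ⟨Z⟩ equals cos θ, and two shifted evaluations of the same circuit recover its exact derivative:

```python
# Standard (time-independent) parameter-shift rule, sketched numerically.
# For a gate generated by a Pauli operator, the exact gradient of an
# expectation value f(theta) is  [f(theta + pi/2) - f(theta - pi/2)] / 2.
import numpy as np

def expectation(theta):
    """<Z> after RY(theta) is applied to |0>, which equals cos(theta)."""
    return np.cos(theta)

def parameter_shift_grad(f, theta, shift=np.pi / 2):
    """Gradient from two evaluations of the same circuit at shifted angles."""
    return (f(theta + shift) - f(theta - shift)) / 2

theta = 0.7
print(parameter_shift_grad(expectation, theta))  # -0.6442...
print(-np.sin(theta))                            # analytic derivative, identical
```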
Published New easy-to-use optical chip can self-configure to perform various functions



Researchers have developed an easy-to-use optical chip that can configure itself to achieve various functions. Because it can perform positive real-valued matrix computation, the chip has the potential to be used in applications that require optical neural networks.
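The chip's internal configuration method is not described in this summary; as a purely numerical illustration, "positive real-valued matrix computation" can be read as applying a weight matrix whose entries are all non-negative reals, the kind of linear operation an optical mesh can carry out on light. The matrix and inputs below are made-up placeholders:

```python
# Illustrative only: a linear layer restricted to positive real-valued
# weights, applied to non-negative input "intensities". The actual chip's
# self-configuration method is not modeled here.
import numpy as np

rng = np.random.default_rng(0)
W = rng.uniform(0.0, 1.0, size=(4, 8))   # positive real-valued matrix (placeholder)
x = rng.uniform(0.0, 1.0, size=8)        # non-negative input intensities (placeholder)

y = W @ x                                # the matrix-vector product such a chip performs
print(y)                                 # outputs are likewise non-negative reals
```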
Published A new way to erase quantum computer errors



Researchers have demonstrated a type of quantum eraser: the physicists show that they can pinpoint and correct a class of mistakes in quantum computing systems known as 'erasure' errors.
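What makes erasure errors special is that their location is known. A classical analogy, offered only as an analogy and not as the quantum scheme in the paper: a single parity bit is enough to reconstruct one bit whose position is known to be lost.

```python
# Classical analogy for erasure correction: if we know *which* bit was lost,
# a single parity bit suffices to reconstruct it. Knowing the error's
# location is what makes erasure errors comparatively easy to fix.
from functools import reduce
from operator import xor

data = [1, 0, 1, 1]
parity = reduce(xor, data)                 # parity over all data bits

received = data.copy()
erased_index = 2                           # the position of the lost bit is known
received[erased_index] = None              # the value itself is gone

known = [b for i, b in enumerate(received) if i != erased_index]
recovered = reduce(xor, known) ^ parity    # XOR of the survivors and the parity
assert recovered == data[erased_index]
```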
Published Powering AI could use as much electricity as a small country



Artificial intelligence (AI) comes with promises of helping coders code faster, helping drivers drive more safely, and making daily tasks less time-consuming. But a recent study demonstrates that the technology, if adopted widely, could have a large energy footprint, one that may in the future exceed the power demands of some countries.
Published Researchers create a neural network for genomics -- one that explains how it achieves accurate predictions



A team of computer scientists has created a neural network that can explain how it reaches its predictions. The work reveals what accounts for the functionality of neural networks, the engines that drive artificial intelligence and machine learning, thereby illuminating a process that has largely been concealed from users.
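The authors' interpretability mechanism is not detailed in this summary. As a generic illustration of tracing a prediction back to its inputs, and explicitly not the paper's method, here is a weight-times-input attribution on a toy linear model over one-hot encoded DNA:

```python
# Generic illustration (not the paper's method): attribute a linear model's
# prediction to individual sequence positions via weight * input.
import numpy as np

bases = "ACGT"
sequence = "GATTACA"

# One-hot encode the sequence: one row per position, one column per base.
x = np.zeros((len(sequence), 4))
for i, base in enumerate(sequence):
    x[i, bases.index(base)] = 1.0

rng = np.random.default_rng(1)
w = rng.normal(size=(len(sequence), 4))    # placeholder weights of a toy linear model

prediction = float(np.sum(w * x))          # the model's score for this sequence
attribution = np.sum(w * x, axis=1)        # per-position contribution to that score

for pos, (base, a) in enumerate(zip(sequence, attribution)):
    print(f"position {pos} ({base}): {a:+.3f}")
```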
Published New technology could reduce lag, improve reliability of online gaming, meetings



Whether you’re battling foes in a virtual arena or collaborating with colleagues across the globe, lag-induced disruptions can be a major hindrance to seamless communication and immersive experiences. That’s why researchers have developed new technology to make data transfer over optical fiber communication faster and more efficient.
Published Birders and AI push bird conservation to the next level



Big data and artificial intelligence (AI) are being used to model hidden patterns in nature, not just for one bird species, but for entire ecological communities across continents. And the models follow each species’ full annual life cycle, from breeding to fall migration to non-breeding grounds, and back north again during spring migration.
Published Could future AI crave a favorite food?



Can artificial intelligence (AI) get hungry? Develop a taste for certain foods? Not yet, but a team of researchers is developing a novel electronic tongue that mimics how taste influences what we eat based on both needs and wants, providing a possible blueprint for AI that processes information more like a human being.
Published AI drones to help farmers optimize vegetable yields



For reasons of food security and economic incentive, farmers continuously seek to maximize their marketable crop yields. Because plants grow at different rates, there will inevitably be variations in the quality and size of individual crops at harvest time, so finding the optimal time to harvest is a priority for farmers. A new approach that makes heavy use of drones and artificial intelligence demonstrably improves this estimation by carefully and accurately analyzing individual crops to assess their likely growth characteristics.
Published Instant evolution: AI designs new robot from scratch in seconds



Researchers have developed the first AI that can intelligently design robots from scratch, compressing billions of years of evolution into mere seconds. It is not only fast but also runs on a lightweight computer and designs wholly novel structures without human-labeled, bias-filled datasets.
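The article gives no algorithmic detail, so purely as an illustration of design by simulated evolution (not the authors' system, which may work quite differently), a minimal mutate-evaluate-select loop over a toy design vector looks like this:

```python
# Toy mutate-evaluate-select loop, for illustration only; the AI described
# in the article is not reproduced here and may use a different method.
import numpy as np

rng = np.random.default_rng(2)

def fitness(design):
    """Placeholder objective: how close the design vector is to a target shape."""
    target = np.linspace(0.0, 1.0, design.size)
    return -np.sum((design - target) ** 2)

population = [rng.uniform(0, 1, size=10) for _ in range(20)]
for generation in range(100):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:5]                                   # keep the best designs
    population = [p + rng.normal(0, 0.05, size=p.size)     # mutate copies of them
                  for p in parents for _ in range(4)]

best = max(population, key=fitness)
print(round(fitness(best), 4))
```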
Published Engineering study employs deep learning to explain extreme events



At the core of uncovering extreme events such as floods is the physics of fluids, specifically turbulent flows. Researchers leveraged a computer-vision deep learning technique and adapted it for nonlinear analysis of extreme events in wall-bounded turbulent flows, which are pervasive in numerous physics and engineering applications and affect wind and hydrokinetic energy, among other areas. The results show that the technique can be invaluable for accurately identifying the sources of extreme events in a completely data-driven manner.
Published Powering the quantum revolution: Quantum engines on the horizon



Scientists unveil exciting possibilities for the development of highly efficient quantum devices.
Published One-hour training is all you need to control a third robotic arm, study finds


A new study has found that people can learn to use supernumerary robotic arms as effectively as working with a partner in just one hour of training.
Published New qubit circuit enables quantum operations with higher accuracy


Researchers have developed a novel superconducting qubit architecture that can perform operations between qubits with much higher accuracy than scientists have previously achieved. The architecture, which utilizes a relatively new type of superconducting qubit called fluxonium, is scalable and could someday be used to build a large-scale quantum computer.
Published Drug discovery on an unprecedented scale


Boosting virtual screening with machine learning allowed for a 10-fold reduction in the time needed to process 1.56 billion drug-like molecules. Researchers teamed up with industry partners and used supercomputers to carry out one of the world's largest virtual drug screens.
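As a sketch of how machine learning typically accelerates virtual screening: score a small sample with the expensive method, fit a cheap surrogate to those scores, and let the surrogate rank the rest of the library. The fingerprints and scores below are random placeholders, not real molecules, and this is not necessarily the team's exact pipeline:

```python
# Sketch of ML-accelerated virtual screening (illustrative, with random
# placeholder data): dock a small sample exactly, train a surrogate model
# on those scores, then let the surrogate rank the remaining library so
# only the most promising molecules get the expensive treatment.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
library = rng.normal(size=(100_000, 64))      # placeholder molecular fingerprints

def expensive_docking_score(fps):
    """Stand-in for a slow physics-based docking calculation."""
    return fps @ rng.normal(size=fps.shape[1]) + rng.normal(scale=0.1, size=len(fps))

# 1. Score a small random sample the expensive way.
sample = rng.choice(len(library), size=2_000, replace=False)
sample_scores = expensive_docking_score(library[sample])

# 2. Train a cheap surrogate on that sample.
surrogate = RandomForestRegressor(n_estimators=50, n_jobs=-1, random_state=0)
surrogate.fit(library[sample], sample_scores)

# 3. Rank the full library with the surrogate and keep only the best-predicted
#    candidates for full docking, instead of docking all 100,000 molecules.
predicted = surrogate.predict(library)
top_hits = np.argsort(predicted)[-1_000:]
print(len(top_hits), "molecules selected for exhaustive docking")
```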
Published Efficient training for artificial intelligence


New physics-based self-learning machines could replace current artificial neural networks and save energy.
Published Scientists successfully maneuver robot through living lung tissue


Scientists have shown that their steerable lung robot can autonomously maneuver through the intricacies of the lung while avoiding important lung structures.