New research shows how brain-like computers could revolutionize blockchain and AI
A CMOS-compatible neuromorphic computing chip could be on the horizon thanks to breakthrough research out of Technische Universität Dresden.

By Tristan Greene

Researchers from Technische Universität Dresden in Germany recently published breakthrough research showcasing a new material design for neuromorphic computing, a technology that could have revolutionary implications for both blockchain and AI.
Using a technique called “reservoir computing,” the team developed a method for pattern recognition that uses a vortex of magnons to perform algorithmic functions near-instantaneously.

Working principle of a magnon-scattering reservoir. Source: “Pattern recognition in reciprocal space with a magnon-scattering reservoir,” Nature
Not only did the researchers develop and test the new reservoir material, but they also demonstrated the potential for neuromorphic computing to work on a standard CMOS chip, something that could upend both blockchain and artificial intelligence (AI).
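The physical reservoir in the Dresden experiment is a magnon-scattering medium, but the general computing principle can be illustrated in software. The Python sketch below of an echo state network is a minimal, illustrative analogue of reservoir computing, not the researchers’ method: a fixed random recurrent “reservoir” projects an input signal into a high-dimensional state, and only a simple linear readout is trained. The network sizes and the sine-versus-square classification task are assumptions chosen for the example.

```python
# Minimal echo-state-network sketch of "reservoir computing" (software analogue).
# The Dresden experiment uses a physical magnon-scattering medium as the reservoir;
# here a fixed random recurrent matrix plays that role and only the linear readout
# is trained. All sizes and the sine/square task below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_reservoir = 1, 200

W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))   # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))                 # keep spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)    # nonlinear state update
        states.append(x.copy())
    return np.array(states)

# Toy task: label each time step as coming from a sine (0) or square (1) wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.concatenate([np.sin(t), np.sign(np.sin(t))])
y = np.concatenate([np.zeros_like(t), np.ones_like(t)])

X = run_reservoir(u)
W_out = np.linalg.lstsq(X, y, rcond=None)[0]              # train the linear readout only
pred = (X @ W_out) > 0.5
print("training accuracy:", (pred == y).mean())
```

The appeal of a physical reservoir is that the expensive nonlinear mixing step above is performed by the material itself rather than by simulated matrix arithmetic, which is where the speed and energy savings are expected to come from.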
Classical computers, such as the ones that power smartphones, laptops and the majority of the world’s supercomputers, use binary transistors that can either be on or off (expressed as either a “one” or “zero”).
Neuromorphic computers use programmable physical artificial neurons to imitate organic brain activity. Instead of processing binaries, these systems send signals across varying patterns of neurons, with timing itself carrying information.
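To make that time dimension concrete, the sketch below simulates a leaky integrate-and-fire neuron, a standard textbook model rather than the Dresden design. Its output is a train of spikes whose timing and rate carry the information, rather than a single static one or zero; the time constants and thresholds are arbitrary illustrative values.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch: instead of a static 0/1,
# the output is a train of spikes whose timing carries the information.
# Time constants and thresholds below are arbitrary illustrative values.
import numpy as np

def lif_spikes(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Simulate a LIF neuron and return its spike times (in seconds)."""
    v, spikes = 0.0, []
    for step, i_t in enumerate(input_current):
        v += (-v + i_t) * dt / tau        # leaky integration of the input
        if v >= v_thresh:                 # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset                   # reset the membrane potential
    return spikes

# A stronger input produces earlier and more frequent spikes: the "answer"
# is encoded in when and how often the neuron fires, not in one binary value.
weak = lif_spikes(np.full(1000, 1.2))
strong = lif_spikes(np.full(1000, 3.0))
print(len(weak), "spikes vs", len(strong), "spikes over 1 second")
```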
This matters for blockchain and AI specifically because neuromorphic computers are fundamentally suited to pattern recognition and machine learning workloads.
Binary systems compute using Boolean algebra. For this reason, classical computers remain unchallenged at crunching numbers. When it comes to pattern recognition, however, especially on data that is noisy or missing information, these systems struggle.
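A toy comparison illustrates the difference. In the sketch below, an exact Boolean equality test fails as soon as a stored pattern is corrupted by a few flipped bits, while a similarity-based match (here, Hamming distance) still recovers the original. The patterns and noise level are invented for illustration.

```python
# Toy contrast between exact (Boolean) matching and similarity-based pattern
# recognition on noisy data. Patterns and noise levels are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.integers(0, 2, size=(4, 64))          # four stored 64-bit patterns

noisy = patterns[2].copy()
flip = rng.choice(64, size=6, replace=False)
noisy[flip] ^= 1                                      # corrupt 6 of the 64 bits

# Exact Boolean comparison: a single flipped bit makes every equality test fail.
exact_matches = [np.array_equal(noisy, p) for p in patterns]
print("exact match found:", any(exact_matches))       # False

# Similarity-based matching still identifies the source pattern.
distances = [(noisy != p).sum() for p in patterns]
print("closest stored pattern:", int(np.argmin(distances)))  # expected: 2
```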
This is why it takes a significant amount of time for classical systems to solve complex cryptography puzzles and why they’re entirely unsuited for situations where incomplete data prevents a math-based solution.
In the finance, AI and transportation sectors, for example, there’s a never-ending influx of real-time data. Classical computers struggle with such occluded problems; the challenge of driverless cars, for example, has so far proven difficult to reduce to a series of true/false compute operations.
Neuromorphic computers, however, are built for dealing with problems that involve incomplete information. In the transportation industry, a classical computer struggles to predict the flow of traffic because there are too many independent variables. A neuromorphic computer can constantly react to real-time data because it doesn’t process data points one at a time.
Instead, neuromorphic computers run data through pattern configurations that function somewhat like the human brain. Human brains flash specific patterns in relation to specific neural functions, and both the patterns and the functions can change over time.
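One classic way such changing patterns are modeled in software is Hebbian learning, in which connections between units that are repeatedly active together are strengthened over time. The sketch below is a generic illustration of that idea, not a description of how the Dresden reservoir is trained; the network size and learning rate are arbitrary.

```python
# Hebbian-style sketch of how a network's connection pattern can change over time:
# units that are repeatedly active together strengthen their mutual weights
# ("cells that fire together wire together"). Sizes and learning rate are
# illustrative assumptions, not taken from the Dresden paper.
import numpy as np

n = 8
weights = np.zeros((n, n))

def hebbian_update(weights, activity, lr=0.1):
    """Strengthen connections between co-active units (outer-product rule)."""
    weights += lr * np.outer(activity, activity)
    np.fill_diagonal(weights, 0.0)        # no self-connections
    return weights

# Repeatedly presenting the same activity pattern embeds it in the weights,
# so the network's response to future inputs drifts as its "pattern" changes.
pattern = np.array([1, 1, 0, 0, 1, 0, 0, 1], dtype=float)
for _ in range(10):
    weights = hebbian_update(weights, pattern)

print(np.round(weights, 2))
```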
Related: How does quantum computing impact the finance industry?
The main benefit of neuromorphic computing is that, relative to classical and quantum computing, it consumes very little power. This means that neuromorphic computers could significantly reduce the time and energy costs of both operating a blockchain and mining new blocks on existing blockchains.
Neuromorphic computers could also provide significant speedups for machine learning systems, especially those that interface with real-world sensors (self-driving cars, robots) or those that process data in real time (crypto market analysis, transportation hubs).