Scaling laws in AI will eventually be grounded in a physical KPI that characterizes model performance in terms of the performance of the computing system it is deployed on.
As AI models and specialized computers (including quantum computers) become the playground for scientific discoveries, there will be a need for a new class of verifiers grounded in scientific experiments.
If we consider the universe ultimately as an information processor, then we can define two classes of technologies: 1) systems that have been realized by nature (e.g. neural networks, error correction); 2) systems that have not been realized by nature (e.g. quantum computers, particle accelerators).
One could consider the second class of technologies fundamentally more novel, as it might be humanity that brought them into existence for the first time in the history of the universe.
I had the opportunity to give a talk at Stanford University on my PhD research, titled "Information technologies at the fundamental physical limits." I would like to sincerely thank Professor Jelena Vučković and her group for hosting me.
Slides from my talk are available at volkangurses.com/talks
Congrats Kyle!
Congratulations to John Clarke, Michel Devoret and John Martinis! The quantum mechanical description of a circuit will continue to transform electronics in the 21st century.
Plenty of people are adopting a cynical mindset, shunning entire research directions that might be as promising as "mainstream" quantum platforms like superconducting qubits. I agree hype is problematic, but we should battle it through scientific discourse rather than belittlement.
I am talking about a time scale on the order of a century. A century ago, we didn't even have the transistor. If everyone adopts a cynical mindset, nobody will care enough to make any progress.
There is a constant tension between deep tech startups and academia: one needs hype for marketing; the other despises it for the sake of credibility.
When the future seems blurry, reading about similar periods in history clarifies a lot.
A lot of my scientist friends are going to Europe. I think it will be a while until the US loses its edge in science, but if the current policies are sustained, research in the US will definitely take a significant hit.
Arguably, the hardest thing about original research is convincing others that it is important.
Humanity being able to create artificial intelligence isn't that surprising (not that this diminishes its significance), since we had nature to copy from. What would be very surprising is making quantum technologies useful, given that virtually no organism in nature uses quantum mechanics deliberately.
Optimizing for originality is better than optimizing for impact.
If you can bring the quantum community here, science posts will increase exponentially.
Happy International Women's Day!
The commoditization of foundational models has many parallels to the early semiconductor industry. Just as Moore's law led to hardware whose operations could be abstracted away by software, foundational models could lead to higher-level operating systems built on top of them.
A breakthrough technology happens when it is so fundamentally new that marketing and branding become secondary concerns.
An elephant in the room in technology is the lack of effort to make analog computing work at scale. This is relevant to both neural networks (our brains are analog) and quantum computing (many of the error-correction challenges in quantum computing are similar to those in analog computing).
Million-qubit-scale is still a stretch but focusing on high-yield fabrication is the correct call. If the process is made available publicly, it will have far-reaching impact on the photonics and quantum community.
Bringing the adaptability of software into hardware with the same cost overhead would be world changing.
Fundamental breakthroughs in technology are usually in one of two categories: 1) Technologies that imitate or replace humans (robotics, AI, computer vision) 2) Technologies that imitate or replace nature (semiconductors, quantum, gene editing).
I would say learn the fundamentals of how these technologies work rather than just applying them. Being able to see through the facade when tech gets overhyped is immensely valuable and allows you to enter and exit tech waves at the right time.
Every technologist should learn and build intuition about quantum mechanics. It provides insight into the fundamental limits of any technology.
Similar to how video games led to many world-changing moments (GPUs, parallel computing, growing interest in tech), VR games will also eventually change the world, most likely in unexpected ways.
If plasmonic circuits can be fabricated at scale and with high yield, they will revolutionize the semiconductor industry. They could offer the best of both worlds: electronics and photonics.
In these uncertain times, let science be your north star.
Hardware leads to better software leads to better hardware leads to better software… (until we reach the fundamental physical limits).
A combination of science and capital can change the world but neither can do it alone.