I happen to have a background at this interface as well, as the founder of DeepEarth and Ecodash.ai. I can tell you that I would greatly value such experience in a collaborator, although I am not currently hiring. While having such a specific interdisciplinary niche can feel limiting, I also see it as a potential superpower for excelling in a very important domain. I'll also add that machine learning and other modeling techniques are a great bridge between the classical natural sciences and modern tech, and they are something I would look for in collaborators. More specifically, from the earth sciences side, "GeoAI" would be a key focus.
The one thing I got out of the MIT OpenCourseWare AI course by Patrick Winston was that all of AI could be framed as a problem of search. Interesting to see Demis echo that here.
The 50+ filters at Ecodash.ai for 90,000 plants came from a custom RAG pipeline on top of 800,000 raw web pages. Because LLMs are expensive, chunking and semantic search (figuring out what to feed into the LLM at inference time) are the part of the pipeline nobody talks about. What I did, roughly: run all text through the cheapest OpenAI embeddings API. Then, I recall that nearest-neighbor vector search on a single query wasn't enough to catch all the information relevant to the question the LLM had to answer. So I generated a large number of diverse queries that mean the same thing (e.g. “plant prefers full sun”, “plant thrives in direct sunlight”, “… requires at least 6 hours of light per day”, …), ran nearest-neighbor vector search on each of them, and used the aggregate statistics to choose which chunks to feed into the RAG context.
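The aggregation step can be sketched roughly like this. All names here are my own, and the hashing "embedder" is a toy stand-in for the real embeddings API (the actual pipeline used OpenAI embeddings); it's a minimal sketch of the voting idea, not the production code:

```python
import hashlib
from collections import Counter
import numpy as np

def embed(text, dim=512):
    """Toy stand-in for a real embeddings API: hash each word into a
    bucket and L2-normalize, so shared words drive cosine similarity."""
    v = np.zeros(dim)
    for w in text.lower().split():
        v[int(hashlib.md5(w.encode()).hexdigest(), 16) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

def expanded_retrieve(query_variants, chunks, k=2):
    """Run nearest-neighbor search once per paraphrased query variant,
    then rank chunks by how many variants retrieved them."""
    chunk_vecs = np.stack([embed(c) for c in chunks])
    votes = Counter()
    for q in query_variants:
        sims = chunk_vecs @ embed(q)            # cosine sim (unit vectors)
        for i in np.argsort(sims)[::-1][:k]:    # top-k neighbors for this variant
            votes[i] += 1
    # chunks retrieved by the most query variants come first
    return [chunks[i] for i, _ in votes.most_common()]
```

A chunk that shows up in the top-k for many paraphrases ("full sun", "direct sunlight", "6 hours of light") is much more likely to actually answer the question than one that matches only a single phrasing.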
Hey, thanks for unpacking what you did at ecodash.ai.
Did you manually curate the queries that you did LLM query expansion on (generating a large number of diverse queries), or did you simply use the query log?
I’m glad Ilya starts the talk with a photo of Quoc Le, who was the lead author of a 2012 paper on scaling neural nets that inspired me to go into deep learning at the time.
His comments are relatively humble and based on public prior work, but it’s clear he’s working on big things today and also has a big imagination.
I’ll also just say that at this point “the cat is out of the bag”, and probably it will be a new generation of leaders — let us all hope they are as humanitarian — who drive the future of AI.
Obviously the article is challenging the view — scientific or not — that mitochondria are not living.
Side note: previously I was funded by NSF and NASA to study such questions from biophysics and astrobiology.
That said, this was a delightful read. I had never conceived of mitochondria as, like the bacteria in our bodies, independent living networks with their own genomes, evolution, and flows of information and energy.
Reading about the health benefits of “external mitochondria” made me think about when I hug my dog: are we exchanging mitochondria, perhaps?
Restricted Boltzmann Machines were a huge revolution in the field, warranting a publication in Science in 2006. If you want to know what the field looked like back then, here it is: https://www.cs.toronto.edu/~hinton/absps/science.pdf
I remember in 2012 for my MS thesis on Deep Neural Networks spending several pages on Boltzmann Machines and the physics-inspired theories of Geoffrey Hinton.
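For readers who haven't seen one: an RBM fits in a few lines. This is my own illustrative toy, not Hinton's code, using the standard energy function E(v, h) = -aᵀv - bᵀh - vᵀWh and a single step of contrastive divergence (CD-1):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal Restricted Boltzmann Machine trained with one step of
    contrastive divergence (CD-1). Energy: E(v,h) = -a.v - b.h - v.W.h"""
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0, 0.01, (n_visible, n_hidden))
        self.a = np.zeros(n_visible)   # visible biases
        self.b = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.a)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1(self, v0):
        # positive phase: hidden activations driven by the data
        ph0, h0 = self.sample_h(v0)
        # negative phase: one Gibbs step back to a "fantasy" reconstruction
        pv1, v1 = self.sample_v(h0)
        ph1, _ = self.sample_h(pv1)
        # approximate gradient of the log-likelihood: <vh>_data - <vh>_model
        self.W += self.lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
        self.a += self.lr * (v0 - pv1)
        self.b += self.lr * (ph0 - ph1)
```

The physics flavor is right there in the energy function: training lowers the energy of configurations seen in the data relative to those the model dreams up on its own.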
My undergraduate degree was in physics.
So, yes, I think this is an absolutely stunning award. The connections to statistical entropy (inspired by thermodynamics), and of course to the biophysics of human neural networks, should not be lost here.
Anyways, congratulations to Geoffrey Hinton. And also, since physics is the language of physical systems, why not expand the definition of the field to include the "physics of intelligence"?
Yeah, I agree about the 2006 Hinton paper. I read it and reread it and didn't get it; I didn't have the math background at the time, and that inspired me to get it. And here I am, almost 20 years later, working on it.
Congrats to Jayesh and team! I was lucky to meet the founding CEO recently, and happy to let everyone know he's very friendly and of course super intelligent.
As a fellow deep learning modeler of Earth systems, I can also say that what they're doing really is 100% top notch. Congrats to the team and YC.