> Special Relativity can be summed up in the sentence: “We live in a spacetime which is an M4 manifold with a hyperbolic Lorentz metric of signature (+−−−)”. General Relativity can be stated accordingly: “The Universe is an M4 manifold with a Riemannian metric of signature (+−−−)” which is a solution of the Einstein equation: $R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda g_{\mu\nu} = \chi T_{\mu\nu}$.
It sounds like you get anti-gravity for free along the way to getting superluminal travel.
> ... **Not Judging** ...
> ...
> Despite it all, I hope to have achieved a truthful text that illustrates the
> **unclassifiable** person Grothendieck was.
> ...
> The quite unusual style of writing—with sentences, subsentences, and the
> excessive use of cross references—intrigued me. One of my friends, a
> professional psychoanalyst, declared that **the author was paranoiac**.
Ignore this guy and just read Récoltes et Semailles[1]
I think the point of the post is 1) it's new and 2) it's a recollection by an older mathematician. These are always valuable and would be lost if not told, regardless of what other sources or stories are already available.
Unfortunately, no... But the French used by Grothendieck is surprisingly simple: it should not be too much of a challenge with Google Translate, a couple of months, and some courage.
"it's faster than other scripting languages, allowing you to have the rapid development of Python/MATLAB/R while producing code that is as fast as C/Fortran"
Also, if you have any need to generate plots & graphs - RIP Julia. I was excited to try Julia, since it seemed to integrate some of the best features of MATLAB, R, and Python. I was truly disappointed to discover that Julia would take minutes to render the exact same plots I was generating in Octave almost instantly.
There's PackageCompiler[0], which lets you precompile the packages you need for work into a system image. This means avoiding precompilation in the REPL every time you start up Julia. That's if you start with the custom system image.
However, it's a community effort and is somewhat non-trivial to get up and running with. Once Julia gets better precompilation/binary packaging support, common workflows like plotting will improve dramatically.
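For what it's worth, a minimal sketch of that workflow, assuming PackageCompiler's `create_sysimage` API; the file names here are made up for illustration:

```julia
# One-off build step: bake Plots into a custom system image.
using PackageCompiler

create_sysimage(
    [:Plots];                                 # packages to bake in
    sysimage_path = "plots_sysimage.so",      # hypothetical output file
    precompile_execution_file = "warmup.jl",  # hypothetical script that exercises plotting
)

# Afterwards, start Julia with:  julia --sysimage plots_sysimage.so
```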
To be fair, that typically only happens the first time you make your graph; replots are typically lightning fast. But this was a major frustration for me. I haven't had occasion to use it more recently, but I understand things may be a bit better for graphing in Julia-land in the last few months?
I had almost the opposite experience lately. I started doing some data analysis in Python/NumPy/Matplotlib because I figured it was mostly plotting and would be quicker in Python. It was excruciatingly slow, partly due to wanting HiDPI plots on my Mac. After switching over to Julia, the new Plots package has been complete enough to handle most use cases and only takes <5s to complete after warmup, compared to matplotlib's 30s+ in many cases. A nice benefit is that Julia types shortened up my data cleaning code significantly too. My favorite Julia Plots combo has been GR in Jupyter notebooks.
This is not really an issue, as long as you ignore the Julia plotting capabilities, which should never have been there anyway. You can easily dump your numbers (and functions) to a text file and gnuplot them.
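A minimal sketch of that dump-and-plot approach, using the DelimitedFiles standard library; the file name and the gnuplot one-liner are just illustrative:

```julia
using DelimitedFiles

x = range(0, 2pi; length = 500)
y = sin.(x)

# Write two columns of numbers to a plain text file.
writedlm("data.txt", [x y])

# Then plot it outside Julia, e.g.:
#   gnuplot -p -e 'plot "data.txt" using 1:2 with lines'
```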
Uhh no. I do a lot of data analysis and if I have to dump stuff into text files every time I want to visualize something quickly then I'm going to go mental.
Simple plots take a fraction of a second in Python/R/Matlab. I feel like many people don't realize how crucial this is. Sub-second plotting makes working with data interactive. If it takes more than 5 seconds to produce simple plots, that's no longer interactive. Imagine if your debugger took half a minute to show you the value of a variable while trying to find a complex bug. You'd start pulling your hair out.
If in Julia it takes me at least half a minute (dumping to a text file, reading it in somewhere else, and then plotting it), Julia is going to remain firmly in the "check this language again in two years to see if the plotting story has become sensible yet" category.
It would be great if it were quicker. But right now, interactive use is pretty good, like 5ms for a simple plot. It's calling Julia from the command line, and thus starting cold, that takes more than 5s.
That is just a really terrible way to do things and I have no idea why you would excuse it like that. The typical workflow if you use the Plots.jl package is to wait half a minute for the package to load in the REPL and then stay in the same REPL for your workflow, so that you won't have to reload the package.
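Roughly what that pattern looks like in a fresh REPL session (a sketch; the relative timings are just the behavior described above, not measurements):

```julia
using Plots            # slow once per session (load + compile)
plot(rand(100))        # first plot pays the remaining compile cost
plot(randn(1000))      # later plots in the same session come back almost instantly
```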
What I do is wait one second for the RCall.jl package to load and then use R's ggplot2 library to plot. Works really well, especially since I am really familiar with ggplot2.
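Something along these lines, for instance (a sketch assuming RCall's `R"""..."""` string macro and a DataFrame named `df`):

```julia
using RCall, DataFrames

df = DataFrame(x = 1:100, y = cumsum(randn(100)))

# Hand the Julia DataFrame to R and plot it with ggplot2;
# print() forces ggplot to render the figure.
R"""
library(ggplot2)
print(ggplot($df, aes(x, y)) + geom_line())
"""
```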
I do not use the REPL; I call Julia scripts from elsewhere and then want to recover the data out of Julia, and text files are perfectly appropriate (a few thousand numbers). Besides, Julia's plotting capabilities are suboptimal with respect to specialized plotting software like gnuplot. I would really prefer it if Julia did not have any plotting stuff.
That's not the typical workflow for the majority of data scientists. And saying you would prefer Julia not to have any plotting stuff sounds really, really dumb, to be frank.
Agreed. As a data scientist myself, I can't imagine Julia getting much "mindshare" among us with the JIT experience it has. Perhaps we're not the real target audience for Julia? But if that's the case then adoption will likely be slow, and limited to only very niche applications and roles. For Julia to really become the next big thing (and solve the damn two language problem), it needs to be an effective solution for data scientists and machine learning engineers--and right now, it just isn't.
I do machine learning and computer vision in Python; statistical analysis, plotting, and anything to do with dataframes in R; and computational stuff, network science, and almost everything else in Julia. I would like to switch my data analysis stuff to Julia, but waiting for libraries and functions to load is just too frustrating when I'm doing things interactively. I'm hoping Julia will have a good machine learning, computer vision, and data science environment in the future, and it is looking like it will. But for now, it is not an easy environment to work with in these applications, and you'd need some fairly specific needs to justifiably use Julia here. The thing is, though, that when you do have relatively esoteric things to do in these applications, it is much easier to do them in Julia.
Startup feels instant to me. Perhaps you are talking about precompilation. In that case, it’s insignificant for most computationally intensive applications.
I don't know the difference between startup and precompilation, and I do not really care, but if I launch a Julia script from the command line it is unbearably slow for no apparent reason. Octave, on the other hand, launches instantly and starts computing right away.
This is understandable, because Julia is not intended to be used that way. You are supposed to "live" inside the REPL. However, I prefer tools that are flexible enough to be used comfortably in non-intended ways.
Perhaps I don’t. Science is objective; it is not a consensus of opinions. I sincerely apologize for the criticism here; it wasn’t a good thing to do. I wish I could delete these comments, but I can’t. I can say sorry, though.
So much truth in here... Especially for the "just funded" startups: they lose their souls fast when the investors start to get impatient. It's a trap in so many cases.
"“I understood nothing, but it was really fascinating,” he said. So Scholze worked backward, figuring out what he needed to learn to make sense of the proof. “To this day, that’s to a large extent how I learn,” he said. “I never really learned the basic things like linear algebra, actually — I only assimilated it through learning some other stuff.”"
I've long wanted a series of interactive math ebooks that work that way. Each would take one interesting theorem, such as the prime number theorem, and work backward.
When you start the book, it would give the theorem and proof at a level that would be used in a research journal. For each step of the proof, you would have two options for getting more detail.
The first option would be at the same level, but less terse. E.g., if the proof said something like "A implies B", asking for more detail might change that to "A implies B by the Soandso theorem". Asking for more detail there might elaborate on how you use the Soandso theorem with A.
The second expansion option gives you the background to understand what is going on. In the above example, doing this kind of expansion on the Soandso theorem would explain that theorem and how to prove it.
Both types of expansion can be applied to the results of either type of expansion. In particular, you can use the second type to go all the way down to high school mathematics.
If you started with just high school math, and used one of these books, you would get the basics...but only those parts of the basics you need to understand the starting theorem.
Pick a different starting theorem, and you get a different subset of the basics. It should be possible to pick a set of theorems to treat this way that together end up covering most of the basics.
That might be a more engaging way to teach mathematics, because you are always working directly toward some interesting theorem.
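As a very rough sketch of the kind of structure such a book could be built on (everything here is hypothetical, just to make the two expansion types concrete):

```julia
# Each proof step can be expanded two ways: "elaboration" keeps the same
# level but adds detail; "background" descends to prerequisite material.
struct ProofStep
    statement::String
    elaboration::Union{ProofStep, Nothing}   # option 1: same level, less terse
    background::Vector{ProofStep}            # option 2: material needed to follow it
end

leaf(s) = ProofStep(s, nothing, ProofStep[])

step = ProofStep(
    "A implies B",
    leaf("A implies B by the Soandso theorem"),
    [leaf("Statement and proof of the Soandso theorem")],
)
```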
Yes, you and absolutely everyone else in the world who loves math, didn't have time to get a PhD, and isn't elitist wants this.
Sadly, the monetization of this is tricky. It probably has to be an open-source effort. It needs some visionary like Wales or Khan, but they are very, very rare.
It's a great idea and I think it's much bigger than maths.
If you do not already know about it, looking up what a "Dynabook" is will not be a waste of time.
Yeah. Reading the post, I see a guy overwhelmed by a bunch of equations and numbers. Which isn't to say he shouldn't learn them, but math is always far more intimidating than other subjects when you don't understand it.
There is a point where one starts to see "behind" the symbols. It's a strange sensation, as if one could understand the ideas in a non-verbal way. The symbols become optional. Intimidation retreats before curiosity at this point.
An amazing book on the subject is:
Jacques Hadamard, "The Psychology of Invention in the Mathematical Field"
What took me a long time -- and is still a skill I'm developing -- is to both verify and "read" the math at the same time, to see the proof and the story at the same time.
At one level, you're observing a technical construction and trying to ensure that it's (mostly) sound; but at another level, you're trying to understand the broader picture of how it fits in, what the builder was trying to accomplish or what perspective of the world they're trying to share.
Mathematics is -- like any language -- just the articulation of an experience, of an insight, of an understanding. As you get further into mathematics (and possess more technical skills of your own), it becomes more important to see "Oh, he's trying to apply the machinery of homotopy to type theories as a means of discussing equivalence" than it is to get bogged down in the technical details. Often, the details are wrong in the first draft, but in a fixable way. (This is extremely common in major proofs.)
> There is a point where one starts to see "behind" the symbols. It's a strange sensation, as if one could understand the ideas in a non-verbal way
I think at some point, you have to compile mathematics to non-verbal ideas for computational reasons -- your verbal processing skills are simply too slow and too simple compared to other systems. Your visual and motor systems are way more powerful and (in the case of motor systems) operate in high dimensions. Much like GPUs in computers, if you can find a representation of a problem that works on a specialized system, you can often get a big computational boost; in mathematics, we have to push our understanding of self and experience to the limits to find more efficient representations of ideas, so we can operate on more interesting or complex ones.
I think most mathematicians work in extremely personal, non-portable internal representations, and then use the symbols as a way to create an external representation that the other mathematicians can compile into their own internal representations.
If you see mathematics as extremely high level code meant to be compiled to equivalent internal representations on thousands of slightly different compilers, I think the language starts to make more sense -- it's meant to be a reverse compilation target for machine code that's been under revision for ~3000 years, so of course it looks a little funky.
Edit:
I will say this --
One thing I've noticed as I've gotten older is that we do a really poor job of teaching students the story of mathematics -- the human motivations, the community, the long standing projects (some have gone on for hundreds of years; some are still ongoing).
I sincerely believe that for young kids (younger than, say, 10), it would be better for their development to teach skills four days a week and simply tell them part of the story on the fifth. It would make mathematics much more relatable and understandable.
I liked but didn't love mathematics in high school, and as such I just did what I had to do and moved on. A decade later I worked through a CS degree and gravitated towards books about mathematicians, and now I have a deep fascination with mathematics. I wish I had read these books when I was in high school!
A survey of how mathematicians think about mathematics [citation needed] found 80% visually, 15% kinesthetically, and 5% symbolically (i.e. in terms of notation).
> math is always far more intimidating when you don't understand it than other subjects.
In a way it is like a magic trick. Frustrating when you don't know how it works, but when you find out it's like: oh, was that all there was to it? However, unlike a magic trick, math leaves you with something that can actually be useful.
Hmm, that's very interesting. I just don't understand how he made it through university. When I was enrolled in CS, I somewhat got along with algebra and was completely lost when it came to analysis, so I dropped out. Back then I was working so hard at my courses that I felt I simply had no time to even consider "other stuff". I would like to know how it was obvious to him what he had to do.
Cosmology is so cool... too bad we do not have time for that: we cannot even cure the common cold!
[1]: https://januscosmologicalmodel.com/pdf/2014-ModPhysLettA.pdf Cosmological bimetric model with interacting positive and negative masses and two different speeds of light, in agreement with the observed acceleration of the Universe