
It's really amazing to see something finally starting to supplant ancient Fortran code in scientific computing.

I think one of the major reasons Python is becoming that choice is that its syntax and the "Python way of doing things" are geared toward making it difficult to obfuscate code into meaninglessness. Clarity is always important in research.



Sorta kinda, but not really. Python isn't supplanting Fortran; it's supplanting Matlab/IDL for prototyping and visualization, and half-baked command languages for interfacing with packages of data-reduction routines. Which is still nice, especially for things like (to take astronomy) AIPS/IRAF, specialized packages of routines that are controlled by the worst command languages possible.


Also note that the code at the core of packages like numpy is often still written in Fortran or C. For example, I'm pretty sure all the linear algebra processing in numpy is handled by LAPACK, the widely used library written in Fortran. Note that software like Matlab and IDL also generally builds on these well-understood packages.
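A quick way to see this in practice, as a minimal sketch assuming numpy is installed: `np.linalg.solve` dispatches to a LAPACK driver routine under the hood, and `np.show_config()` reports which BLAS/LAPACK implementation the numpy build links against.

```python
import numpy as np

# Solve the linear system Ax = b; numpy hands the real work to a
# LAPACK solver routine (compiled Fortran/C), not pure Python.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = np.linalg.solve(A, b)
print(x)  # solution of the 2x2 system, here [2., 3.]

# Report the BLAS/LAPACK libraries this numpy build was compiled against
# (e.g. OpenBLAS, MKL, or the reference implementation).
np.show_config()
```

So "Python doing linear algebra" is, in the common case, a thin layer over exactly the Fortran code being discussed.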

Fortran isn't going anywhere, we're just hiding it in the background.


Everyone uses LAPACK. If you came along with your own versions of those routines, no one would trust them. LAPACK et al. are some of the most optimized, most debugged code on the planet. Hardware vendors design around LAPACK. It's incredibly important.

Also check out Sage: http://www.sagemath.org/


Yep. The Intel F9x compiler smokes everything - at least in part because its versions of LAPACK/BLAS/ATLAS are so aggressively optimized.


Also, f2py (distributed with numpy) makes it easy to wrap Fortran subroutines in Python.
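A minimal sketch of that workflow, assuming a Fortran compiler (e.g. gfortran) is installed; the subroutine and module names here are made up for illustration:

```shell
# Write a tiny Fortran subroutine to a file.
cat > fib.f90 <<'EOF'
subroutine fib(a, n)
  ! Fill array a with the first n Fibonacci numbers.
  integer, intent(in) :: n
  double precision, intent(out) :: a(n)
  integer :: i
  do i = 1, n
     if (i == 1) then
        a(i) = 0.0d0
     else if (i == 2) then
        a(i) = 1.0d0
     else
        a(i) = a(i-1) + a(i-2)
     end if
  end do
end subroutine fib
EOF

# Compile it into a Python extension module named "fib".
python -m numpy.f2py -c fib.f90 -m fib

# The wrapped routine is now callable from Python; f2py turns the
# intent(out) array argument into a return value automatically.
python -c "import fib; print(fib.fib(8))"
```

f2py reads the `intent` declarations to generate a natural Python signature, which is why the output array doesn't need to be passed in from the Python side.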


I mostly see Matlab used for algorithm development, which means super-short snippets. O(n²) software-engineering complexity creep isn't that big a deal when n is a page or two of code.



