Hacker News

To be fair, there's also Python+NumPy and R in that space, not just Matlab. Besides the "tinker with LLVM" thing, what does Julia offer that Python (or Cython for speed)+NumPy does not?


Writing fast code in Julia requires less effort than it does in Python or R. You don't have to drop down to Cython or Rcpp to get good performance. If you write an algorithm in Julia the same way you'd write it in C, it will achieve equal performance. If you write it the same way you'd write it in Python or R, it may not be optimal due to the cost of memory allocation, but it's still faster than Python or R.

Julia is more concise than Python. The language was designed to work with multidimensional arrays; it wasn't bolted on afterwards. There is no difference between a 2D array and a matrix; * always does matrix multiplication and .* always does element-wise multiplication. There is no awkwardness involving differences between NumPy arrays and Python lists. Everything has a type, and you can make a multidimensional array that efficiently stores values of any type. You can define your own types and define how arithmetic operators act on them with a minimal amount of code.

Julia's type system makes writing efficient algorithms easy without sacrificing any performance. If you define your own immutable type and define your own operators on it, Julia can make those operators run as fast as they would on ordinary values. In addition to general matrices, we have diagonal, symmetric, and tridiagonal matrices. The same routines that work on general matrices work on these as well with the same syntax, just more efficiently.

Julia uses multiple dispatch instead of traditional class-based OO. Methods are not a part of the object; instead, they operate on the object. Different methods with the same name can be defined to work on different types, or a single method can operate on a set of types, but the functions it calls may be implemented differently for each of these types. This is a better fit for technical applications, where the data doesn't change much but the methods do.
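Python's standard library offers only a limited analogue of this; as a hedged sketch, `functools.singledispatch` below picks an implementation based on the type of the first argument, whereas Julia dispatches on the types of all arguments (the function name `describe` is just an illustration):

```python
from functools import singledispatch

@singledispatch
def describe(x):
    # fallback used when no more specific registration matches
    return "something"

@describe.register(int)
def _(x):
    return "an integer"

@describe.register(list)
def _(x):
    return "a list"

describe(3)      # "an integer"
describe([1])    # "a list"
describe(2.5)    # "something" (falls back)
```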

Julia is homoiconic, which is more useful than this article makes it seem :). It's easy to write code that writes code. If built-in language features aren't enough to get good performance with concise syntax, you can write a hygienic macro that does it for you.


To the rescue of NumPy: a matrix from linear algebra and a 2D array are not exactly the same thing. In Python they are distinct, mutually convertible types, and in practice I think this is hardly a drawback. That mathematical notation overloads the same symbol for matrix multiplication and "normal" multiplication is unfortunate; NumPy's solution is as good as introducing two different operators (one of them being an awkward .*)
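A minimal NumPy sketch of the two convertible types (note that np.matrix, while still available, is nowadays discouraged in favor of plain arrays):

```python
import numpy as np

a = np.array([[1, 2], [3, 4]])   # 2-D ndarray: * is elementwise
m = np.matrix(a)                 # matrix type: * is matrix multiplication

elementwise = a * a              # [[1, 4], [9, 16]]
matmul = m * m                   # [[7, 10], [15, 22]]

back = np.asarray(m)             # the two types convert back and forth
```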

I love the multiple dispatch part about julia though.

My fear, however, is that unlike Python, Julia will lack enough libraries, especially on the unscientific side (GUIs, databases, networking, all the other stuff you need).


> That the multiplication operation is overloaded in the Mathematical World with the same symbol as the "normal" multiplication is unfortunate, numpy's solution is as good as introducing two different operators

The thing is that the multiplication operation for matrices is matrix multiplication, not elementwise multiplication. When you apply a polynomial like x^2 + y to matrices, you do not want to apply the polynomial elementwise – you want to square the x matrix and add the y matrix to it.
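A small NumPy illustration of the distinction (x and y here are arbitrary example matrices): the elementwise reading and the matrix reading of x^2 + y disagree.

```python
import numpy as np

x = np.array([[0, 1], [1, 0]])
y = np.array([[1, 0], [0, 1]])

elementwise = x**2 + y       # squares each entry: [[1, 1], [1, 1]]
matrix_poly = x.dot(x) + y   # squares the matrix: [[2, 0], [0, 2]]
```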


I cheered out loud when I first started tinkering with Julia, tested whether multiplication did the right thing with matrices and vectors, and saw that it did. Multiplying two row vectors should give an error! Element-by-element operations should get their own operator, not the other way around.

But as a long-time R user, I'm hesitant to bet the farm on Julia for a project at work where it would be ideally suited. Maybe there's a way to squeeze it in on the side.


Sometimes an array of numbers is just an array of numbers. The language shouldn't presume too much about what you mean to do with them.

The moment you need an extra dimension (or anything other than 2, really), Matlab's ‘everything is a matrix’ approach falls apart. Matlab is a toy language in so many ways, and this is just another one.

It's a pity that Julia adopted Matlab's pop matrix semantics instead of some solid and general principles from APL or J. Even modern Fortran would have been a better model for an array DSL. From what I've read of the Julia docs, they actually want you to write loops. But Julia looks great otherwise. With macros and a good compiler, maybe the array features can be fixed at some point.


How exactly does Matlab's approach fail? I haven't had that much experience with it, but I do vaguely remember that it supports n dimensions.


Yes; adverbs, the rank operator, and some other parts of the APL/J approach are things I miss a bit and find tedious to emulate or work around.


If you are doing linear algebra, I agree. Yet linear algebra is not the only thing I want to do with numbers. I think most of the time I actually use elementwise operations, such as:

    from numpy import linspace
    x = linspace(0, 10, 1000)
    y = (x < 5)*4.0
    z = x**2 + y
Of course I can just use matrices and then I have the information at hand that I am doing linear algebra right now:

    x = matrix([[3, 0],
                [9, 5]])

    In [28]: x**2
    Out[28]: 
    matrix([[ 9,  0],
            [72, 25]])
I think this is not too much boilerplate, and it gives nice semantic information within the source code. However, if you want to perform this with a 2D array object, you can also use its dot method:

    In [ 1]: a
    Out[ 1]: 
    array([[ 9,  0],
           [72, 25]])

    In [ 2]: a.dot(a)
    Out[ 2]: 
    array([[  81,    0],
           [2448,  625]])

    In [ 3]: a * a
    Out[ 3]: 
    array([[  81,    0],
           [5184,  625]])
The approach of the matrix object nicely takes into account that the operands of a "normal" multiplication `*` commute, so elementwise multiplication fits the picture here. Whereas matrix multiplication - which is non-commutative for most matrices - is performed by a different method.


If the multiply operator weren't such a big issue, the creators of NumPy wouldn't have attempted to add a new operator for matrix multiplication to Python. For a dynamic language such as Python, operator overloading for such closely related types is big trouble. If I write a function that uses multiplication, I either use member methods such as "dot" or check the type explicitly; otherwise there is no guarantee what will happen. The worst part is that the errors are purely logical, and the only way to debug is to trace from the end result all the way back to the point of object creation; it isn't pretty.
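That attempt eventually succeeded: PEP 465 added a dedicated @ matrix-multiplication operator in Python 3.5, so the two meanings no longer share a symbol for plain arrays. A small sketch:

```python
import numpy as np

b = np.array([[2, 0],
              [1, 3]])

hadamard = b * b   # elementwise: [[4, 0], [1, 9]]
matmul = b @ b     # matrix product: [[4, 0], [5, 9]]
```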


This isn't intrinsic to matrices vs. number arrays. The matrix-multiplication issue is just a mathy version of the plus-as-string-concat troubles ("Foo: " + 1 + 1 makes "Foo: 11" while 1 + 1 + " Foo" makes "2 Foo"). There are always holy wars about whether "+" should be string concat because of that.
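For what it's worth, Python itself sidesteps that asymmetry by refusing to mix the two types under +; a tiny sketch:

```python
try:
    result = "Foo: " + 1          # TypeError: str + int is refused outright
except TypeError:
    result = "Foo: " + str(1)     # explicit conversion is required

# result is "Foo: 1" either way the programmer spells it out
```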

Both approaches have their merits.


I can rarely think of functions

    foo(arg)
where I would like to pass either strings or numbers to an operator + that polymorphically concatenates or performs addition. In this respect I like languages that offer dedicated string-concatenation operators (like Haskell's ++ or Lua's ..).

More generally: I think that + and * should always commute for the types they are applied to, and mixing them should follow the rules of distributivity.


I meant non-commutative


> Julia will lack enough libraries, especially on the unscientific part (GUI, databases, network, all the other stuff you need).

Yes, but this is rapidly improving. Julia has Gtk bindings that have seen a lot of improvement over the past two months. There are ODBC and SQLite interfaces, a MySQL interface is in progress, and probably others.

Julia has the advantage that you can write fast bindings in pure Julia, which alleviates the extra cognitive and tooling overhead of writing extensions in C.

Building a language ecosystem is a bit of a Ponzi scheme - but it has real potential for a great payoff at the end!


It is pretty standard for elementwise operations to use a . prefix.

In Matlab you have .* ./ .^ and probably more, for good reason.

I don't really see it as awkward; it's actually very useful when needed. It can be confusing if you're learning a language and think .* might be the dot product, though.


Check out the Julia PyCall library. It allows arbitrary Python calls from inside Julia, with nice autogenerated bindings from Python objects to Julia types.


Check out the Python JIT compiler Numba. It compiles annotated Python and NumPy code to LLVM (through decorators). It's been wicked fast for my use cases: http://numba.pydata.org/


Everyone's answer to that question will be different. In my opinion, there's lots of things to love about Julia.

First class arrays and array literals. It's a wonderful thing… like Matlab but very smartly designed.

The type dispatch system makes so much sense for mathematical work. It's simply how math is done. And Stefan Karpinski (co-creator) often compares it to linguistic grammars, too, which may be a stretch but I think there's some truth to it. It just feels right. And it makes things very extensible, right down to the core language.

And the core language is indeed mostly Julia itself, compared to NumPy, where things are often implemented in C or Cython. I've tried to hack on some Cython things in NumPy and was immediately turned off; it was so hard to debug and run interactively.

Julia's interactivity is wonderful. The IJulia project brings over some of the best user experience of NumPy (in my opinion)… which is not NumPy but IPython.

And the community is so very great and supporting. The package system is such a great asset and really lowers the bar to entry.


Interesting that you mention IJulia. My concern with it is that when you are trying to develop a new technique or algorithm, the idea of introducing extra layers of code running in another system (in this case IPython) seems like a lot to deal with. Maybe I'm just a wimp ;)


But it's not running through Python at all (as I understand it). The kernel is all implemented in Julia. It just uses the IPython frontend and architecture.

See this post on the implementation of IHaskell[1] for more details. My understanding is that IJulia uses the same concepts. Notably, when running IJulia, you can't even use %%magics to change back to Python mode.

1. http://andrew.gibiansky.com/blog/ipython/ipython-kernels/

(Also, it's firmly endorsed by the language creators. Stefan recently gave a talk using IJulia).


Thanks for the explanation and link. I think you are correct in that it doesn't appear to be using any Python. The demo I saw was given by Fernando Perez [http://www.youtube.com/watch?feature=player_embedded&v=F4rFu...] where he demonstrated a cross language example which, whilst technically impressive, wasn't something I felt I would attempt. Like I said I'm probably just a wimp.


Agreed, Python with NumPy is definitely a key player in this space, probably more significant these days than Matlab. Don't forget Octave, which continues to hold its place as an open, Matlab-compatible(ish) option. While I'm a fan of R for experimentation and prototyping, it is often let down by poor performance, particularly on matrix calculations. R's forte is really in providing reference implementations of an amazing array of statistical methods, often by the author of the technique.

One of the advantages of Julia touted by the authors is that much of the Julia system is written in the Julia language, making it easy for users to understand many of the algorithms and contribute to the system. In practice I don't know how true that is (it seemed to spend a long time compiling C/C++ code when I last built it), but I can see the rationale.


Julia has a community that doesn't feel threatened that their language is waning in popularity in some fields, and therefore doesn't feel the need to defend it every chance they get.


No need to make this personal. It was the original blog article that concentrated on the negative things and did not do a whole lot to explain Julia's benefits.

I really love Scipy and friends and I also think Julia is a promising system.


I remember there being talk of eventually being able to call Julia from within Python. I've also been quite happy using Numba as an alternative to Cython for some things when I need speed. It's a lot more lightweight with less boilerplate, although still a little rough around the edges.


I don't know about calling Julia from within Python, but you can call Python from within Julia. This makes it very easy to wrap and use Python libraries for things which Julia doesn't have good support for yet.


It would be awkward to run Django from inside Julia, though...


This already exists as a prototype as part of IJulia, see Leah Hanson's awesome blog post about it here: http://blog.leahhanson.us/julia-calling-python-calling-julia...

The collaboration between the scientific Python and Julia communities in recent months has been awesome to watch.


I use R day to day, but it's NOT as easy as Matlab. There just isn't the same level of documentation and clarity.


Julia offers freedom from wondering whether you're betting on the wrong horse by coding for Python 2.7 or 3.x.



