Hacker News | jon6's comments

This is the most useless academic 'discovery' I've ever seen. Am I wrong?


The number zero was considered useless for a long period of time [1].

A certain piece of academic information isn't useless just because you can't think of a use for it, so I'd say yes, you're wrong.

[1] http://www.amazon.com/Zero-The-Biography-Dangerous-Idea/dp/0...


> The number zero was considered useless

It still is as far as I can tell. Try adding/subtracting it - it makes no difference. Multiplying it just gets you back the same useless number no matter what. Dividing by zero? Don't even go there!


I can't tell if you're trolling or being willfully ignorant?

Zero represents the state of "nothingness" - the universe before anything exists. The blank slate, tabula rasa, etc. The empty page before you write anything on it.

Zero is a very deep concept and it is far from useless. See the book "Zero: The Biography of a Dangerous Idea".


Being funny I think.

I'd think it's actually a valid point though, if the question is put in the form: Does zero (or its reciprocal) have any existence beyond being a useful theoretical construct? Can the physical world actually reach zero, or only asymptotically approach it?

Is the universe infinitely big? Can infinitely small things exist? Can a thing be said not to exist (i.e. we have zero of it), or is there always a minuscule probability of it spontaneously appearing due to quantum effects? Is a vacuum really empty? If we have zero, how do we measure it in the face of quantum uncertainty? And so on...

---

Edit: grammar


>Can the physical world actually reach zero, or only asymptotically approach it?

Five minutes ago, I had zero apples in my hand. At this exact moment, I have no way of knowing how many apples are in my hand due to signal delay and processing time.

Numbers as we know them are only useful for describing the past, but at that task they can work perfectly.


Rather, your brain is telling you that you had zero apples in your hand five minutes ago. Given that any measuring tool (including a brain) is a physical system, isn't it also subject to fundamental uncertainty?

Granted, the probability that uncertainty causes two measuring devices (such as my brain and yours) to return different answers for how many apples were in your hand five minutes ago is negligibly small, but is it truly zero?

Also granted that in practical terms it's not worth arguing over, and I don't propose that such possibilities should be taken into account in everyday life.


We may have flawed measuring devices, but they are attempting to measure something with a real, constant value. The fact that we are not directly connected to reality doesn't mean that reality is an illusion.

If, in reality, there were zero apples in my hand, I could say that I was holding one apple, but then I would be wrong.


Yeah, learnin' is dumb.


How is the engine for BFG different from the older Doom3 version?


http://en.wikipedia.org/wiki/Doom_3#Doom_3:_BFG_Edition

The BFG Edition features enhanced graphics, better sound with more horror effects, a checkpoint save system, and support for 3D displays and HMDs.

Also on the Doom wiki:

http://doom.wikia.com/wiki/Doom_3_BFG_Edition


The community fork of the Doom Wiki tends to be a little more up-to-date on the whole:

http://doomwiki.org/wiki/Doom_3:_BFG_Edition


Not that we don't have enough of them already, but this release comes with a new port of the Doom 1/2 engine as well[0]. I don't know either codebase well enough to say what's different, but it's now C++ vs. the original's C[1], and there appears to be some Xbox- and PS3-specific code as well (not that it's useful to mere mortals).

[0] https://github.com/id-Software/DOOM-3-BFG/tree/master/doomcl...

[1] https://github.com/id-Software/DOOM


I wonder what's coming up next as the new game-industry standard programming language after C++?

People like John Carmack have personally invested a lot in C++ (with DX and GL).


But with all the stuff ripped out of DOOM and DOOM2, my original 3.5" floppies might be worth something one day.

If they still work...


I don't see how this re-release would have much effect on their value, given that the original games themselves have been "preserved" countless times over across the Internet, likely since Doom's launch. Not to say the disks are any less of a collector's item.


I was thinking that as the "new" versions make their way into the market the original versions might be harder to come by. But I agree, it's more about the physical disks with labeling than the content. I wish I still had the boxes.


I believe some tech from id Tech 5 was back-ported into it. Wikipedia says it also has support for 3D [head-mounted] displays.


Yep. John Carmack stated that the BFG edition is actually Doom 3 in the Rage "framework". E.g. Rage's network layer and (probably?) all the behind the scenes stuff like input and file management, boilerplate for different GPUs etc.


People named the obvious things, but from doomwiki[0] I noted those:

    > * Flashlight-projected character shadows are disabled.
    > * More light sources in the levels.
The latter may make the game a bit prettier, but the first change removes one of the few things that made Doom 3 stand apart, and might be a letdown.

http://doomwiki.org/wiki/Doom_3:_BFG_Edition#Differences_wit...


As far as I can tell, not much except the addition of stereoscopic 3D and various optimisations.


It appears to contain support for the Oculus Rift VR headset. http://www.oculusvr.com/

John added Rift support to Doom3 BFG and it was the primary demo when the Rift was displayed at E3 and subsequent (kickstarter) tours.


Where is this code that implements the Oculus support?


I vaguely remember using this knowledge of the scales to "cheat" on the violin. If you move your hand into whatever position puts your first finger on the first note of the scale, then you can always move the rest of the fingers in exactly the same manner: whole whole half, whole whole whole half. My music theory knowledge is crumbling, but what I mean is that if you move into 2nd position on the A string to play a B scale, it's much easier than starting in 1st position with the 2nd finger.

I did this at a competition once when given some insane scales (like 3 flats in the key signature) without having ever practiced the scale itself, and got it right.
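For what it's worth, the whole-whole-half pattern described above is just the major-scale interval formula, and it can be sketched in a few lines of Python (the note spellings and example roots are my own illustrative choices, not from the comment):

```python
# Major-scale interval pattern: W W H W W W H
# (a whole step = 2 semitones, a half step = 1 semitone).
NOTES = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]
STEPS = [2, 2, 1, 2, 2, 2, 1]  # whole whole half, whole whole whole half

def major_scale(root):
    """Return the notes of the major scale starting on `root`."""
    i = NOTES.index(root)
    scale = [root]
    for step in STEPS:
        i = (i + step) % len(NOTES)
        scale.append(NOTES[i])
    return scale

print(major_scale("C"))  # ['C', 'D', 'E', 'F', 'G', 'A', 'B', 'C']
```

The same fixed step pattern works from any root, which is exactly why the same finger spacing works once the first finger lands on the tonic.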


I tried it on my Kubuntu i386 netbook earlier today. The PPA installed OK, and when I ran 'netflix-desktop' for the first time the Wine installer downloaded and installed Mono, then proceeded to install Firefox. I noticed some errors but didn't look at them very carefully.

I ran 'netflix-desktop' again and it said 'firefox.exe not found' at which point I gave up.


Is there a youtube video?

I found this http://www.youtube.com/watch?v=NDGl-eDEuNQ but I'm not sure if it's the same game.


To understand Lisp is to understand interpreters. With that understanding you can create domain-specific languages, which is extremely powerful.

But I wouldn't recommend using Lisp itself; macros in particular are unhygienic.
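A tiny sketch of the interpreter idea above (written in Python rather than any real Lisp dialect, with made-up names, just to illustrate why understanding Lisp and understanding interpreters go hand in hand):

```python
# A minimal s-expression evaluator: numbers evaluate to themselves,
# and a list means "apply the first element to the evaluated rest".
import operator

ENV = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    if isinstance(expr, (int, float)):   # a number is self-evaluating
        return expr
    op, *args = expr                     # a list is a function application
    return ENV[op](*[evaluate(a) for a in args])

# (+ 1 (* 2 3)) written as nested Python lists:
print(evaluate(["+", 1, ["*", 2, 3]]))  # 7
```

Everything a Lisp does, including DSLs, grows out of this small evaluate-a-tree loop.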


Common Lisp != Lisp. You mean Common Lisp; Lisp is the family of languages (which also includes the Scheme sub-family, Racket, Clojure, Arc, Kernel…).


You are right, but I disagree. I almost always call "Common Lisp" "Lisp." Scheme is Scheme, Clojure is Clojure, etc etc. I don't care about the family vs language distinction. I think it hurts Common Lisp's adoption. I'd sooner call Common Lisp, Scheme, Clojure, etc part of the "Lisp family" instead of just "Lisp," and leave "Lisp" to mean "Common Lisp."


It's disingenuous to call Scheme and Racket Lisps. They have parentheses and first-class functions, but the similarities stop there.

So yes, I equate Lisp with Common Lisp. I didn't read the entire article (far too long), but he does mention 'defmacro', which is in Common Lisp.


So, homoiconicity is a trifling, meaningless similarity?

"Sure, it may be homoiconic, use prefix notation, have first-class functions (in addition to all the other usual functional paradigms that aren't unique to Lisps), but it's not a Lisp."

Big ok to that one. This must be pedantry of the highest caliber, not ignorance.


They are not homoiconic. The underlying data structure for many Schemes, and Racket, is not a list; it is a syntax object. Of course you can still do metaprogramming with syntax objects, but I wouldn't call it the same thing.


You should be careful here, and not lump together "many Schemes" and "Racket" (or other specific Scheme implementations). The thing is that Scheme standards have traditionally avoided tying the language to a macro system that requires some specific representation for syntax -- giving you only the simple rewrite-rules system means that you don't actually need to know that representation.

In Racket, OTOH, there are definitely syntax objects with enough functionality to write code that handles them, and I suspect that you know that. The question is whether this should be considered "homoiconic" or not, but this is a kind of a subjective issue, since at an extreme, I can say that all languages that have strings are homoiconic. Perhaps you need more from the language to make it so, maybe eval, or maybe actually require it to have compile-time procedural macros?

In any case, Racket will have all of the features that CL does, so it is arguably at least "as homoiconic" as CL is. But in fact, it has more than just s-expressions: these syntax objects are basically sexprs + a bunch of stuff like source location and lexical context, so in fact they represent more than what lists in CL do. Should I then conclude that Racket is more homoiconic than CL?

And this is not a tongue-in-cheek argument: in fact, many CL implementations are aware of the limits of sexprs as a good representation for code, and add things like source location via a backdoor, like a hash table that maps pair objects to additional properties. Racket does that in its basic syntax representation, so IMO it's fine to indeed consider it more homoiconic. And I also say that for the addition of lexical-context information -- that's something that is not only included in the Racket syntax object, it's something that you just cannot get in CL. So if homoiconicity is being able to have a high-level representation of code (unlike raw strings), then this is another point where Racket wins the pissing contest.

Finally, it's not that all of the "many Schemes" are limited as described above -- there are many of them that have their own macro systems with similar syntax values, and that includes Schemes that follow R6RS, since it dictates syntax-case, which comes with them. It just happens that Racket has traditionally been running at the front lines, so it's more advanced.


It's not really necessary to second you, but I'd like to add that "code as data" is more real in Racket than in CL, since code is not just the AST; it's also (as you point out) location and, more importantly, context. In this setting, Racket's syntax objects are more "code as data" than CL's "code as sexp" will ever be.


Right. Perhaps a better way to summarize this is that:

* Lisp made the first giant step of having code representable as data for meta-programming, and chose sexprs to do so

* Common Lisp came later, and made the important step of requiring this representation, which means that in every CL implementation you're required to have the code as data aspect

* But the flip side of this is that CL hard-wires just sexprs, it forbids an extended type, which means that you can't get anything more than sexprs (without resorting to "extra properties" hash table tricks)

* Meanwhile, Scheme (R5 and others that have only `syntax-rules') took a step back by specifying only rewrite rules which can be implemented in any way an implementation chooses

* But some Scheme implementations did use sexprs, but since they need to encode more information (lexical context) they extended them into syntax values (note that some Scheme low-level macro systems try to present users with a simplified interface where user code sees just the sexprs)

* Later on, Racket took further steps and enriched its syntax values with "more stuff"

* R6RS got closer to this too, by adopting the syntax-case system (but some people had issues with "wrapping" symbols, since you can't do that with the hash table trick)

* And finally, R7RS (the "small" version) is going to take a step back into the R5RS days. (And in the "big" language it looks like they'll adopt one of these systems that try to keep the sexpr illusion.)


Homoiconicity doesn't refer to lists, but to syntax being represented in the data structures of the language.
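Python isn't homoiconic in the Lisp sense, but its `ast` module gives a rough feel for what "syntax represented in the data structures of the language" buys you -- the parse tree is ordinary data you can rewrite before running it (a hand-wavy analogue for illustration, not a claim about any Lisp):

```python
# Rough analogue of code-as-data: parse a program into a data
# structure, rewrite that structure, then run the rewritten version.
import ast

tree = ast.parse("1 + 2 * 3", mode="eval")
for node in ast.walk(tree):
    # Rewrite every multiplication into an addition.
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
        node.op = ast.Add()

print(eval(compile(tree, "<demo>", "eval")))  # 1 + (2 + 3) = 6
```

In a homoiconic language this kind of manipulation uses the language's everyday data structures directly, rather than a dedicated AST library.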


I am admittedly still a Lisp (et al.) rookie, but isn't the entire point of Scheme that it introduces hygienic macros? Or are you referring to some other (perhaps sarcastic) notion of macro hygiene?


That's one design characteristic of Scheme, but I wouldn't call it "the entire point".


You can do cool stuff with unhygienic macros, however, like anaphoric macros. Interested readers should check out On Lisp by Paul Graham, as well as Let Over Lambda by Doug Hoyte.



[Most of] Let Over Lambda as free HTML: http://letoverlambda.com/textmode.cl/guest/toc

Graham's On Lisp as free PDF: http://lib.store.yahoo.net/lib/paulgraham/onlisp.pdf



hipster.. overload..


What style of dress do you expect for people in their 20's dressing casually? This is such a silly label that it's almost not worth arguing over, but I fail to see what expectation has failed for you here.


I am currently visiting/working in NYC and staying on the Lower East Side. Seriously, everyone here dresses like that--this hipster thing goes way deeper than I could have imagined...


I'm kinda hoping they were being ironic.


What about that is hipstery? That's how people dress here in NYC.


My exact thoughts. Literally every single person is a hipster.


All I see is just a bunch of people in their 20s/30s...


Young person who dresses in a non-boring way = hipster, apparently.


You have to give the trend a name so you can refer to it after it becomes boring itself.


I think it’s just called fashion.

That hipster bullshit is getting tired. The stereotypes are stupid and it all comes down to berating people for what they wear – for whatever stupid reason. Why would that ever be cool?

All I see is lots of young people who know how to dress fashionably.


The trend has been named for at least a decade.

Oh, I see what you did there.




The point is that the word "hipster" has been in continuous use to describe alternative youth culture since the Beats.

http://books.google.com/ngrams/graph?content=hipster&yea...


As Wikipedia explains, the term has been used to describe the culture of two distinct decades, not the decades in between.


Where does Wikipedia say that it wasn't used in the intervening decades? That's not true. Look at the ngram link I supplied – it clearly was in use. What do you think it meant?


The major issue he has seems to be that Visual Studio is much easier to use than the command line + makefiles. OK, that may be, but if you are porting to Linux then you are probably already familiar with the command line and makefiles. Supporting gcc on Windows from a makefile that you already need on Linux is not super hard.

But in any case, I find Visual Studio a horrendous piece of garbage.


Also, NaCl supports threads while Emscripten does not. For a large set of applications (including my own games) this is a deal-breaker for using Emscripten.


Beej's networking guide is pretty good:

http://beej.us/guide/bgnet/

