Still running a 2009 AirPort Extreme at our house. I haven't yet brought myself to replace it, even though I'd probably see a significant benefit from 802.11ac and better IPv6 support.
It has been reliable for nearly a decade with barely a hiccup. More than I can say about any previous router I've had.
I can't decide whether to go as-open-as-possible-but-more-expensive, or with something like Ubiquiti, or what.
It depends on the home. I use 2.4GHz and disable 5GHz because it causes a lot of issues.
The latest issue I have is that the phone I'm writing this on keeps switching from 2.4 to 5GHz when I'm in the bedroom. When it switches from 2.4 to 5, the connection becomes unusable due to poor SNR. I wish I could pin it to 2.4GHz; instead I keep having to wait for it to drop the connection and renegotiate on 2.4GHz.
I'm also running a 5+ year old Time Capsule, extended by an AirPort Extreme that was given to me. It covers our house just fine, and even though there are newer technologies, I don't think they would make much of a difference since we only have 18 Mbps service. The TC has started conking out a couple of times a week, but I don't know when I'll actually replace it. Having wireless backups is handy, but unfortunately my AirPort Extreme isn't new enough to support this functionality (only the 2013 model does). For now, I'll put up with moderately frequent reboots to make sure that my machine is always backed up.
I use a lot of JSON and YAML via libvariant [0], a set of C++ libraries. For example, the core class Variant is a JSON-ish object implementation, and there are pluggable Serializers and Deserializers.
Libvariant includes a command line tool, varsh ("Variant shell"), that can schema-validate JSON and YAML documents.
This situation happens in the news world all the time. A company or agency has original databases, Excel sheets, what have you, but they don't consider that publishing them in a "human-readable" format is nearly the same thing as publishing the raw data. Try calling the place for a copy, and they'll hang up on you. Yet they don't think that a crafty outsider can probably reconstruct the original by scraping.
What's particularly interesting here is guessing the motivation behind publishing. Was the information a trade secret, or did a middle manager want to show that their team is ahead of the others? Or is it a feather in the company's cap, showing off its know-how and capability?
In any case, most web-published data isn't initially considered published data by the publishers, who in turn don't think to state any restrictions governing it. That's when we scrape and make use of it, and even if there are restrictions on republishing, you can still perform transformative derivative work and claim it as such.
The fun legalese part is what happens when they discover what you're doing and try to lash out, or interrupt a standing scrape. One time, all it took to unblock access was to show up at a meeting and get yelled at by a police captain for 30 minutes. Our retort started with "In the interest of public safety, ..."
> One time, all it took to unblock access was to show up at a meeting and get yelled at by a police captain for 30 minutes. Our retort started with "In the interest of public safety, ..."
Well, the PD found out that we were scraping and publishing their data when a superior asked them about it. They were embarrassed and felt ambushed. Imagine your boss asking you, "Hey, data guy, when did we start sending data to the paper?"
The data itself was public safety information, and there was every reason to publish it. Anyhow, our access got cut off, and when we inquired about it, they set up a meeting at their headquarters instead of providing any answers. That morning, I showed up at their Death Star-looking building with my editor, and we spent 30 minutes getting chewed out by guys in uniforms, suits, and badges for "incorrect geocoding" and other false information we were supposedly publishing.
We said that yes, there were some errors, but that we made every reasonable attempt to validate the data (see http://pp19dd.com/2009/02/vessels-in-distress/). After the guy running the show vented, he showed us the proper way to geocode and correct errors, during which time I was thinking, "Uh, why not send us the lat/lng you're showing us here, instead of berating us?"
The compromise was that they'd add "precinct zone" information to the dataset, and we could proceed so long as we checked whether each geocoded point was within its zone. We promised to check this with a point-in-polygon algorithm, and the guy was happy as a clam that we took note of his work and gave him respect. After that, he eased up and showed us some of the other cool stuff the PD data guys were working on. For example, they pre-plot escape vectors for burglaries, so when cops are dispatched, they first go to where the bad guys are likely running to, not where they ran from.
I've been using the TextMate 2 alphas for several years as my daily-driver text editor (mainly for C/C++, Python, shell scripts, CMake, JavaScript, HTML, CSS, and occasional advanced editing of email and other documents).
I occasionally use vi(m) in a terminal for quick tasks, but I find TextMate to be a very comfortable environment for a combination Mac user/command line jockey.
The compiler for the TI C55x family [1] of fixed-point DSPs is implemented that way: sizeof(char) == 1 and sizeof(int) == 1, both 16-bit types. A practical consequence is that you can't easily share the same struct header files with, for example, a more conventional platform. Another is that ASCII strings take twice the space.