Hacker Newsnew | past | comments | ask | show | jobs | submit | rolleiflex's commentslogin

Similar, but slightly different story for me. I ended up buying it as an enthusiast ‘Apple-grade’ product where the UX was a convenience layer over something I would be able to do on my own. Then they got high on their own supply and started to believe they could be as restrictive and charge as much of a premium as Apple, forgetting that they’re still a product primarily for fairly technical people.

Also, for all server needs I’m running a Raspberry Pi at a single digit fraction of the ongoing power use of my Synology, and it just no longer makes sense to have this weird rare platform as my base when I could just be running things on Debian and systemd.

More philosophically, life got busy, and I no longer have the mental capacity and willingness to maintain something like a Synology. The only large content I back up are my family’s photos and I just pay Apple for iCloud monthly, I consider that to be money well spent.


> I just pay Apple for iCloud monthly, I consider that to be money well spent.

I use iCloud Photos for my photos, so I don’t have to manage storage on my phone, while always having access to everything. I quite like it.

I also have a Synology NAS for other things.

A little voice in the back of my mind is telling me to also backup my photos to the NAS, because I have no idea how Apple is backing things up. I might be willing to pay for 3 copies for just my photos, but is Apple going to do that for all users of iCloud without advertising it? Probably not.

I’m not sure of the best way to go about doing an initial backup to the NAS, or handling the ongoing changes. I think it also gets a bit messy with Live Photos… which is another reason why iCloud Photos is so appealing, if it can be fully trusted.


The nightmare scenario is that Apple locks you out of your Apple ID for some reason.

Luckily, Apple also provides a pretty easy backup path that lets you have a local copy, if you have a Mac and a NAS:

- set up your Mac’s Photos app and iCloud to download everything locally

- set up Time Machine backups from your Mac to a NAS

That’s it. You get 3-2-1 (your Mac, iCloud, and your NAS) and can get a copy of your data even if your Apple ID gets locked out.

Standard disclaimer: only the Time Machine copy is a true backup (e.g. if you delete a file by mistake, only Time Machine can help you restore it; iCloud is a sync, not a backup). That said, for me personally, this scheme (local copy + cloud copy + NAS backup via Time Machine) takes basically zero work to maintain once set up and gives me peace of mind.
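If you prefer doing the Time Machine step from the command line, macOS ships `tmutil` for this. A minimal sketch, assuming an SMB share on the NAS (the share name and user are placeholders); it's guarded so it's a no-op on non-Mac systems:

```shell
# macOS-only sketch; tmutil is Apple's Time Machine CLI.
# The share URL is a placeholder -- point it at your NAS's SMB share.
if command -v tmutil >/dev/null 2>&1; then
  # Set the NAS share as the Time Machine destination
  sudo tmutil setdestination "smb://backupuser@nas.local/TimeMachine"
  # Confirm what Time Machine will back up to
  tmutil destinationinfo
else
  echo "tmutil not found (not macOS); nothing to do"
fi
```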


This works as long as you have enough storage locally. Our photo library is ~3TB split over 2 users, and while you could theoretically use an external SSD for storage, that kinda cuts down on mobility. You could leave the drive attached and drag it around, or detach it and lose access to your photos on the go.

For a long time, I had a Mac mini running 24/7, where each user was logged in (via Remote Desktop), and that would synchronize photos to an external drive, and the Mac would then make backups (via Arq) to my NAS as well as a remote location.

I don’t count the Mac copy in my 3-2-1, as it is basically a sync (each side, iCloud and Mac, is a sync), and without versioning, i.e. APFS snapshots, if one side goes bad, so does the other.

I’ve since switched to using Parachute for day to day backups, and every ~6 months I make a manual full export of the photo library in case Parachute missed something.


I thought about going this route, but I have 73GB of photos currently, which will only continue to grow over time.

While not the biggest library, it’s approaching the point where I’d need to start buying upgraded storage on any new Mac I buy, or use external storage for my Photos library. One of the things I like about iCloud Photos is my computer doesn’t need much local storage, Photos will manage it, downloading full res images on demand and purging them as needed.

I’d want a backup solution that is optimized for this, to allow for backups of the originals, without having to have them all downloaded all the time.


Makes sense. Unfortunately the closest thing I’ve seen is https://github.com/boredazfcuk/docker-icloudpd, but that requires turning off Advanced Data Protection, which is a nonstarter for me.


A large library becomes hard to manage.

The family one is somewhere around 759 GB. Having this stored locally fills a decent-sized drive, so it needs to be on network storage. Macs don’t love doing this, and somehow it’s difficult to keep a file share mounted 100% of the time on macOS (though it’s 100% reliable on an Ubuntu VM hosted on that same Mac).

I concocted a vile script to download iCloud Photos and then save them to a Synology.

I’m looking hard at UGreen or Ubiquiti for my next NAS. The Synology thing where you can put same-size or larger drives in the array is probably the only bit I’d miss at this point.


Can’t say anything about UGREEN, but UNAS with Unifi identity endpoint is magic on a Mac. You install it, sign in with your UI credentials, and it automatically mounts all shares you have access to whenever you’re on a network where the NAS is reachable.

It works on my LAN, but also over my site to site VPN from my summerhouse, as well as my road warrior wireguard VPN.


Apple uses a mix of Google Cloud and AWS, as well as their own data centers. Google and AWS use multi-region redundancy, and I can only assume Apple does the same in its own data centers. The data in the 3rd-party data centers is encrypted.

That means, at least for Google and AWS, that your data is being stored with redundancy not only in a single data center, but in multiple data centers, so that if one data center completely vanishes, your data will still be available.

That being said, it's always good to make a local backup. I use a tool called Parachute Backup (https://parachuteapps.com) on my Mac to automatically export photos from Apple Photos to my NAS. It also works with "iCloud optimized storage", so it won't just back up the size-optimized copies.

I've tested it against Photosync (https://www.photosync-app.com/home) as well as a manual export of unmodified originals, and in a library consisting of 180k photos and videos, I had 300 compare errors, most of which were Live Photos, which are not exported identically.
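For anyone wanting to run a similar spot-check themselves, a rough sketch that hashes every file in two export trees and diffs the result. The directory names are placeholders (e.g. a Parachute export vs a manual export), and GNU coreutils is assumed (on macOS, use `shasum -a 256` and drop xargs' `-r`):

```shell
#!/usr/bin/env bash
# Spot-check sketch: compare two photo export trees by content hash.
set -euo pipefail
A="${1:-/tmp/exportA}"
B="${2:-/tmp/exportB}"
mkdir -p "$A" "$B"
# Hash every file relative to the tree root, in a stable order
hash_tree() { (cd "$1" && find . -type f -print0 | sort -z | xargs -0 -r sha256sum); }
if diff <(hash_tree "$A") <(hash_tree "$B"); then
  echo "trees match"
else
  echo "trees differ"
fi
```

Any line diff then points at a file that is missing on one side or exported with different bytes.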

Both Parachute and Photosync offer the ability to export unmodified originals along with AAE files, so that if you need to rebuild your Apple Photos library, everything including undo history is preserved (AAE files contain the edits).

Tools like Synology Photos and Immich (and more) only export the "latest" version, whatever that may be, meaning if you have edited the photo on your phone, that edited version is exported, and if you later restore from your NAS backup, there is no undo history. In other words, they apply the edits destructively.

For backing up from the NAS to another location I use Arq Backup (https://www.arqbackup.com), which also supports backing up iCloud Drive files that are cloud only.


Parachute Backup looks very promising, thanks. I’ll have to spend a little more time later checking it out and seeing if that’s the direction I’ll go.

I do have my NAS backed up to Synology’s cloud backup service. I don’t love it, and it seems expensive, but it was easy to set up at the time and gave me some peace of mind for that data. The big issue I see is that I feel like I’d be stuck buying another Synology to restore to if my current one fails.


I just back up the entire photo library with Kopia. Is that as good as what you do?


It depends.

Do you use iCloud optimized storage, or do you download originals to your machine? Kopia only backs up what it can see, and in the case of iCloud optimized storage, that means the size-optimized miniatures and not the original files.

Second, I haven’t researched this, but iPhoto used resource forks and extended attributes quite extensively for its library, and if the same is true for Apple Photos, Kopia will not pick those up, but Arq will. That was the very feature that caused me to purchase Arq all those years ago.


Interesting. I don’t optimise the storage, so that’s not a huge concern, but the extended attributes possibly are. Does Arq support S3 APIs?


Yes, but before you "panic", look into whether it's really a problem. Kopia is a fine tool, and while it's a bit lacking in native integration, it works well, and for some things even better than Arq.

Arq wins on system integration. It supports waking the machine from sleep at backup time, keeping it awake while backing up, materializing cloud only files, and many other "nice to have" things.

Kopia however wins on efficiency and speed. The same ~3TB incremental backup finishes in 3-5 minutes with Kopia and 45 minutes with Arq, and Kopia supports deduplication across multiple clients, meaning if you back up your family photo library from multiple locations, Kopia will only store it once.
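For reference, the Kopia side of such a setup is only a couple of commands. A sketch, where the repository path and password are placeholders (I'm assuming kopia is installed, so this is guarded to be a no-op otherwise):

```shell
# Sketch of a basic Kopia setup; repo path and password are placeholders.
if command -v kopia >/dev/null 2>&1; then
  # Kopia reads the repo password from this env var (or use a secrets manager)
  export KOPIA_PASSWORD="change-me"
  # One shared repository; multiple machines snapshotting into it
  # get the cross-client deduplication mentioned above.
  kopia repository create filesystem --path /srv/backups/kopia-repo
  kopia snapshot create "$HOME/Pictures/Photos Library.photoslibrary"
else
  echo "kopia not installed; skipping"
fi
```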

In any case, even Apple doesn't recommend backing up Apple Photos libraries directly, and instead recommends exporting the photos, which is what I do with Parachute.

https://www.arqbackup.com/documentation/arq7/English.lproj/s...


Yep. I set up Parachute this morning and now back up that export using a script. Pretty easy.


Immich isn't completely terrible for "backups". It takes forever to upload everything though, as it has to download every photo from iCloud and then upload it to Immich on your NAS.


They did have a bug that corrupted images on import I think? https://news.ycombinator.com/item?id=45274277

So maybe don't fully trust it.


You’re talking about completely different aspects of the product though?

The second top-level comment also suggests, based on the camera's manual, that the camera itself might be corrupting things.


Sure, it's a different part of the product, but still worrying that it may have happened at all.

Looking at it further though, you're right, this probably wasn't Apple's fault.


> More philosophically, life got busy, and I no longer have the mental capacity and willingness to maintain something like a Synology. The only large content I back up are my family’s photos and I just pay Apple for iCloud monthly, I consider that to be money well spent.

I'm more or less in the same situation.

I no longer use a NAS as my "daily driver", and as such it made sense to skip Synology and instead go for the cheaper option, which in my case was the UNAS Pro (the only model available at the time).

Next to it sits an "old" Mac Mini M1, which hosts my Plex server, with storage provided by the UNAS over 10Gbps ethernet.

Everything else I might at some point have used the Synology for has instead been delegated to iCloud. Documents, photos, and everything in between are stored there, and each laptop makes a backup with Arq Backup to the NAS as well as to another cloud provider.

My NAS today is literally just an advanced USB drive attached to a server. Part of my consideration at the time was just getting a DAS and plugging that into the Mac Mini M1, but ultimately the UNAS Pro (with 10Gbps networking) was cheaper than a Thunderbolt DAS, and I already had a switch capable of 10Gbps.

I made a similar "journey" some years back, where I removed pretty much everything cabled from the network and moved everything to WiFi, doubling down on providing "the best" WiFi experience I could, which today means WiFi 7 with 2.5Gbps uplinks, hence the 10Gbps switch.

My network is 100% private. I don't expose ports to the internet, meaning maintenance is no longer a "must do" task. The only access is via WireGuard, which can be done with an always-on profile that routes traffic for that specific subnet, but realistically it is mostly never used. The most remote streaming I do is via a site-to-site VPN from my summerhouse to my house, over which I can stream Plex.


A problem I had with CityMapper is that, at least when I last used it, it used the distance units of the city you’re in, with no ability to change them. For example, if you’re in New York, the distances will be in feet, and feet only.

Since I already know the public transport of my own city, and I reach for CityMapper when I’m travelling, it’s a jarring omission. I was incredulous enough to check with support, and sure enough they confirmed that, at least as of last year, the units indeed cannot be changed.


This has always driven me nuts about Google Maps too.

Contrary to what some PM apparently believes, nobody taught me kilometers on the plane.


That's a pretty big oversight on their part. But when I'm in a tight spot (like train service is ending late at night) I've had CityMapper save my ass a few times when Google was showing me inaccurate information. This has mostly been in NYC and London.


Turkish also retains the hard C in some forms; the city of Kayseri in Asia Minor is also from Caesarea. However, some of that has been eradicated by the more recent influence of French. An example of that is Julius Caesar, which is Jül Sezar.


I’m the maintainer of this project. While it is good to see it posted here, I should note that Aether is on a hiatus for now.

I might eventually get back to it, and I’ve been working on it since 2013 so it is a long lived work for me, but I currently have zero capacity to support it in any meaningful way.

That said, if there are any serious would-be maintainers interested, I would be happy to review code and eventually distribute commit rights.


I pulled the repositories, built it, and was looking to get involved.

If you or others are interested in working on it with me I'd be open to it.


I'm following the instructions on the post from the original owner of the repository involved here. It's at https://til.simonwillison.net/llms/llama-7b-m2 and it is much simpler. (no affiliation with author)

I'm currently running the 65B model just fine. It is a rather surreal experience, a ghost in my shell indeed.

As an aside, I'm seeing an interesting behaviour with the `-t` (threads) flag. I originally expected this to be similar to make's `-j` flag, where it controls the number of parallel threads but the total computation done stays the same. What I'm seeing instead is that it seems to change the fidelity of the output. At `-t 8` it has the fastest output, presumably since that is the number of performance cores my M2 Max has. But up to `-t 12` the output fidelity increases, even though the output drastically slows down. I have 8 performance and 4 efficiency cores, so that makes superficial sense. From `-t 13` onwards, performance decreases exponentially to the point that I effectively no longer get output.
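If anyone wants to reproduce this on their own machine, a quick way to compare `-t` values is a timed loop. A sketch, where the binary and model paths are assumptions matching the era of this thread (adjust to your own build); it's guarded so it's a no-op without them:

```shell
# Sketch: compare llama.cpp throughput at different -t values.
MAIN=./main
MODEL=models/65B/ggml-model-q4_0.bin
if [ -x "$MAIN" ] && [ -f "$MODEL" ]; then
  for t in 4 8 12 13; do
    echo "== -t $t =="
    # Generate a fixed number of tokens so the timings are comparable
    time "$MAIN" -m "$MODEL" -t "$t" -n 64 -p "Hello" >/dev/null
  done
else
  echo "build llama.cpp and download a model first"
fi
```

Comparing the outputs (not just the times) across runs is what would surface the fidelity difference described above.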


That's interesting that the fidelity seems to change. I just realized I had been running with `-t 8` even though I only have an M2 MacBook Air (4 perf, 4 efficiency cores), and running with `-t 4` speeds up 13B significantly. It's now doing ~160ms per token versus ~300ms per token with the 8-thread setting. It's hard to quantify exactly whether it's changing the output quality much, but I might do a subjective test with 5 or 10 runs on the same prompt and see how often it's factual versus "nonsense".


I also noticed that hitting CTRL+S to pause the TTY output reliably caused the prompt to suddenly start printing garbage tokens after CTRL+Q resumed it a few seconds later. It may have been a coincidence, but my instant thought was very much "synchronization bug".


Don't you hate it when someone interrupts your train of thought.


What do you use it for, out of curiosity? Can it do shell autocompletes (this is what “ghost in the shell” made me think of, haha).


Nothing. It's technology for the love of it.

I'm sure there are potential uses, but training your own LLM would probably be more meaningfully useful than running someone else's trained model, which is what this is.


IIRC, based on the latest numbers, 70+% of Turks hold their savings in foreign currency. So it’s accessible and widely used. It’s probably one of the reasons Turkey is weathering the storm better than expected, though it’s still a pretty bad situation. This matches my personal experience as well.


I have a 32GB M1 Pro as a work laptop, and connecting a Retina 4K external display is not a fun experience for me either. It was very surprising to see it fail to maintain 60fps in the OS’ own animations.


That seems like something is wrong; if this is a known issue with such capable hardware, I'd find that shocking.


Are you using the built-in HDMI port, or a dongle? I had the same issue before switching off the dongle.


This is through USB-C, which also charges the laptop.


Is the connection handling HDMI natively, or via DisplayLink? I could definitely see performance issues being accentuated for DisplayLink.


‘One of Phantom’s bestsellers is a repeater crackling palm break called “Shaggadelic Mojo,” whose flower-strewn box also features a lava lamp, a beaded curtain and a disco ball. It dates back to the late nineties, when Austin Powers was cool.’

Is this a fly trap to catch old people? I thought lava lamps were mostly a 60s/70s thing.


Remember that these are completely disposable products. You set it off, enjoy the kaboom, and that’s it. So you don’t have to be, like, into the aesthetic to think “that might be fun to see”; it just has to be interesting enough to add to your pile of fireworks.

It’s more like watching Austin Powers, not wanting to be the character.


Austin Powers isn’t cool anymore? I’m pretty young but that flytrap would get me.


I'd even go as far as to say that's groovy, baby, yeah.


I'd definitely buy Fat Bastard themed fireworks.


While optical Thunderbolt cables do exist, they are a niche product. Most Thunderbolt cables in existence are made of good old metal wires.

https://www.corning.com/optical-cables-by-corning/worldwide/...


Ah. I was mistaken -- I thought that TB3 requires fiber cables, while USB-C allows for copper.


Here’s something I’ve found that improves on it: you can get a $30 massage gun from Amazon with one of those round soft heads, gently put it to the back of your head where the manual exercise above has you tap with your fingers, and run it at the slowest setting. It’s exactly the same thing, and it’s very effective at shutting off tinnitus. I do this when mine gets bad.

I’ve also noticed my tinnitus is inversely proportional to the previous night’s sleep quality, so the article bears out from an anecdotal standpoint.

