I switched from a 2012 Macbook Air to a 2019 Macbook Pro a few weeks ago and I couldn't agree more with the article. One of the reasons I'm using a Mac instead of Linux is that I make music, and the new Macbook sucks for this purpose: the same recording settings I used on my old laptop are unusable on the new one. 90% of the time things work fine, but sometimes I get glitches and pops in the middle of a recording. Why? Who knows. I've done everything I can think of, like closing other apps and disabling a ton of things running in the background; I freeze tracks in my DAW, but things still break sometimes. I really hope some update from Ableton/Apple fixes things soon; if not, I'm considering downgrading from Catalina.
But wait, there's more! Sometimes I get glitches even when I'm not recording, just playing tracks back. I have this problem on literally the same Ableton projects that used to "just work" on the old machine (same number of tracks/plugins). The workaround is to restart the laptop. Seriously, it feels like Windows XP all over again. I can't imagine playing live shows with such an unreliable setup - fortunately I just make tracks for myself, but if I were going out to some kind of jam session or open mic I'd consider switching to Windows.
EDIT: One more thing that sucks about the new macOS: dropping 32-bit compatibility. RIP all 32-bit VSTs. Fortunately I don't use that many plugins, and it turned out everything I really care about has a 64-bit version. If it didn't, I'd hold on to the old OS as long as possible.
My wife's been having repeated AV glitches on her '19 Macbook Air. Audio on video cuts out after a few minutes, very frequently, across browsers, on local players, and even on locally downloaded podcasts. Video calls do similar things, which has, you know, been a problem. Best I can figure is there are some bad software + hardware glitches with some of the Siri-related stuff and maybe the fingerprint chip. Disabled Siri but it still happens. The Apple store's seen it and reset it and been like, meh, it's fine, but it's very much not.
Never seen this kind of flakey-out-of-the-box bullshit from Apple. I really need a new machine but I'm holding out for a better run than what they have now. And maybe some undoing of their recent price hikes (they used to drop prices, substantially, with some frequency, in the distant past of a few years ago and earlier). Plus all the damn dongles. My '14 has lots of ports and zero USB-C and those ports are great, and I still don't have anything that's USB-C aside from adapters for work laptops, so... why do I want only USB-C again, and no other ports at all? Maybe in another 5-7 years it'll make sense. Give them time to switch all their iDevices to USB-C and for me to replace all those (they last many years, that's why we buy them). Then at least my laptop and iPads and iPhones and such can share peripherals, which is moving toward some kind of benefit to offset the loss of ports.
> I still don't have anything that's USB-C aside from adapters for work laptop
The USB-C thing is culturally a bit curious. It's been three or four years since reviewers of laptops and the like started complaining if they didn't have USB-C ports - but there still seems to be very little need for them. The few devices I've acquired that have a USB-C socket have all come with a cable that has a USB 3 Type-A plug on the other end, never a C-to-C cable.
(I see the article mentions this, but I found it striking because to facilitate home working I've just bought two new devices with USB-C at the device end, and both came with cables to an A socket)
I don't get why they didn't do a transitional design for a few years. Replace the Thunderbolt ports with USB-C but leave the rest the same. It takes me probably damn near a decade to cycle even half my peripherals to a new interface, and I bet that's closer to the norm than people going out and replacing 100% of their stuff with USB-C and Bluetooth within a year, so switching cold-turkey to a port I hadn't even seen in the wild yet was a baffling decision.
Hell, to this day the only actual USB-C devices I've seen in person are Apple peripherals and monitors catering to Macbook users, which almost all include other ports anyway. I'd add Android phones, but those mostly have USB-A connectors on the other end of the cable (like the iPhones that were shipping for a while after the USB-C Macbooks came out—there's another giant WTF). The USB-C market for flash drives seems to suck (the form factor's worse, the price is higher).
[EDIT] oh and when the new port is intended to replace all other ports the half-life on "legacy" devices for me probably goes from ~10 years to ~20. Seriously. WTF Apple.
Going USB-C on their phones and tablets while also swapping the Thunderbolt ports and maybe the power ports on their computers for USB-C, then leaving it that way for three years or so before removing the other ports, seems like it would have been a much better lead-in if they really had to do it. At least then we'd have been able to slowly accumulate USB-C dongles without feeling bitter about it, since they'd work on i-devices too and so feel like they're providing some kind of benefit or future-proofness. As it is, it's 2020 and I still feel like I'm wasting money buying either USB-C or Lightning dongles—the former because I don't need them for anything but very new Macbooks, and the latter because surely they're going to switch those to USB-C eventually (surely... right?) so those Lightning cables and adapters will become junk long before they should.
[EDIT] go figure, my wife just today bought a new iPhone, wants to put some music from her 2019 Macbook Air onto it, needs to plug it in at least once in order to do that (have to plug in first to enable wifi sync), and can't, because the phone shipped with a fucking USB-A cable and the single Lightning-to-USB-C cable we have in our house is missing. This is going down right now. So much for not needing USB-A ports on a laptop in 2020 even if you're fully in the Apple ecosystem. She's so annoyed I wouldn't be surprised if she returns the phone. I mean, granted, the phone isn't the latest model, but it came out way after they made the all-USB-C switch on their laptops.
What's concerning is that other laptop makers are following Apple like lemmings. My current favorite laptop is the Dell Precision 55xx series, the business version of the XPS 15. The newest models of this line, the 15" 5550 and the 17" 5750, are USB-C only. Couldn't they have kept just one USB-A port? I have dozens of peripherals, and I literally have nothing that uses USB-C.
One of the main reasons I left Apple laptops, after using 3-4 of them over 10 years, was the removal of the USB-A ports. Adding new USB-C? Great. But removing all of the older ones? A big middle finger to the users. Then there's the removal of the Esc key, the butterfly keyboard debacle, the Touch Bar that removes the function keys, etc. Sorry Apple, you are going off in a direction that I cannot follow.
User friendliness is a problem. There's no easy way to tell what protocols a USB-C device actually supports, whether it's USB 2, USB 3, DisplayPort, etc.
USB-C hubs aren't possible the way they were with USB 2 or 3, so you're always limited to the number of USB-C ports on your machine. The only hubs you can get are ones that break out USB-A ports, which is fine for now because most peripherals are still USB-A, but it will become a problem when peripherals start using USB-C themselves.
USB4 will in theory fix most of this by setting minimum required capability at 20 Gbit/s with DisplayPort Alt Mode, and laying out support for hubs from the start. It still has a painful number of 'optional' alt modes depending on device type, though.
Just because USB4 becomes a thing doesn't mean everyone will support it. The cost of 20 Gbit/s and DisplayPort Alt Mode means plenty of vendors will definitely keep shipping USB 3 or USB 2 over Type-C instead.
Remember, not all USB Type-A ports are USB 3. Likewise, not all USB Type-C ports will be USB4.
You'd probably find a more authoritative source and better documentation by searching, but here's my understanding of it:
USB-C can be used with alternate modes like DisplayPort or Thunderbolt, where the wires are no longer transmitting USB data but are used as a dumb pipe for the underlying protocol.
If you get a hub, it can operate in one of the following modes, depending on its type and capabilities (rough sketch after the list):
* Upstream port is in USB mode: it can provide USB 2 and 3 to downstream ports, but downstream devices can no longer use DisplayPort or Thunderbolt.
* Upstream port is divided into USB 3 lanes and DisplayPort lanes. The DisplayPort side can be mapped to a single downstream port, but there is less USB 3 bandwidth available to share across the remaining ports. Still no luck if a downstream device wants to use Thunderbolt.
* Upstream port is Thunderbolt. Since TB is essentially PCI Express the hub could pass through some TB lanes to a single downstream port while having its own USB 3 controller to provide the USB side of things. Still no luck for DisplayPort downstream devices.
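To make the trade-offs concrete, here's a toy model of those modes - purely illustrative, based on my (possibly imperfect) understanding above, not on any real USB-IF API:

```python
# Toy model of the USB-C hub modes described above. The mode names and
# capability sets are just my reading of the spec behavior, nothing official.
from dataclasses import dataclass

@dataclass
class HubUpstreamMode:
    name: str
    usb_data: str           # what USB bandwidth is left to share downstream
    displayport_ports: int  # downstream ports that can carry DisplayPort
    thunderbolt_ports: int  # downstream ports that can carry Thunderbolt

MODES = [
    HubUpstreamMode("USB only", "USB 3 (full)", displayport_ports=0, thunderbolt_ports=0),
    HubUpstreamMode("USB 3 + DP alt mode", "USB 3 (reduced lanes)", displayport_ports=1, thunderbolt_ports=0),
    HubUpstreamMode("Thunderbolt", "USB 3 (hub's own controller)", displayport_ports=0, thunderbolt_ports=1),
]

def can_serve(mode: HubUpstreamMode, wants: str) -> bool:
    """Rough check: can a downstream device get what it asks for in this mode?"""
    if wants == "usb":
        return True  # every mode still offers some USB data
    if wants == "displayport":
        return mode.displayport_ports > 0
    if wants == "thunderbolt":
        return mode.thunderbolt_ports > 0
    return False

for mode in MODES:
    print(mode.name, "->", {w: can_serve(mode, w) for w in ("usb", "displayport", "thunderbolt")})
```

Whichever mode the hub picks, at least one class of downstream device loses out, which is why "a hub" stops being a simple answer with USB-C.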
This doesn't even address the issue that not all devices can do Thunderbolt (my 12-inch Macbook can't) and it becomes very hard to find a hub that will actually work and do what you want. It took quite a bit of research for me to understand the specs so I could find a solution to do both 4K 60Hz video and USB (2 only, not enough bandwidth for 3) on my MacBook with a single USB-C port.
I won't even get into the bullshit naming conventions they decided to apply retroactively. WTF is USB 3.1 Gen 1? Is the next one gonna be called USB 10 X Pro Max?
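For anyone else who's lost, here's the decoder ring as I understand it (the same signaling rate has been renamed with each spec revision):

```python
# USB-IF retroactive renaming, as best I can tell. Every name in a list
# refers to the same nominal signaling rate.
USB_NAMING = {
    "5 Gbit/s (SuperSpeed)":   ["USB 3.0", "USB 3.1 Gen 1", "USB 3.2 Gen 1"],
    "10 Gbit/s (SuperSpeed+)": ["USB 3.1 Gen 2", "USB 3.2 Gen 2"],
    "20 Gbit/s (dual lane)":   ["USB 3.2 Gen 2x2"],
}

for speed, names in USB_NAMING.items():
    print(" = ".join(names), "->", speed)
```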
But I don't need or care about all that. What I need is a hub just like we had all those years, one that gives USB 3 connectivity in a C form factor to 5 devices or more. If I need more than that, there are plenty of docking stations available.
edit: I also looked for a USB-A hub with USB-C upstream; still no luck, so I'm stuck with a hub + power adapter and a dinky OTG A-to-C.
As far as I know the spec doesn't allow it for some reason. I guess one reason could be the power side (which I haven't mentioned and am not familiar with)?
Let's assume the hub requests 19V from the upstream port. Unless it has built-in, variable, per-port voltage regulation, it will not be able to respond to power requests from the downstream devices if they differ from what is coming from the upstream port.
This isn't a problem when providing USB-A ports because you either request 5V from upstream and pass it through or request a higher voltage and have a single 5V regulator and pass its output to the downstream ports.
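A toy sketch of that constraint - the numbers are made up for illustration and real USB PD negotiation is far more involved than a single voltage check:

```python
# Toy model of why USB-C downstream ports are harder to power than USB-A ones.
UPSTREAM_VOLTAGE = 19.0  # whatever the hub negotiated from the host/charger

def can_power_usb_a_port() -> bool:
    # USB-A downstream is always 5 V, so one shared 5 V regulator
    # (or a straight 5 V pass-through) covers every port.
    return True

def can_power_usb_c_port(requested_voltage: float, per_port_regulator: bool) -> bool:
    # A USB-C downstream device may request 5, 9, 15 or 20 V via PD.
    # Without a regulator on that specific port, the hub can only offer
    # whatever single rail it already has.
    if per_port_regulator:
        return True
    return requested_voltage == UPSTREAM_VOLTAGE

print(can_power_usb_c_port(9.0, per_port_regulator=False))  # False: request mismatches the shared rail
print(can_power_usb_c_port(9.0, per_port_regulator=True))   # True, but now every port needs its own PD controller
```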
See, you could technically design something that does that. Per-port charging of up to 100W is definitely possible. Basically you have a USB PD controller on every port with dynamic voltage control.
This is all possible, it's just a HUGE pain in the ass (and costs a lot) to design that way. The problem is you essentially end up with a giant, VERY expensive USB hub. And no one wants to spend $400 on a USB hub.
Which is why USB-C is bad: nobody asked for a connector where the benefits are marginal but one of the many downsides is that a proper hub replicating the behavior of previous generations now costs more. And since nobody would pay that much, economies of scale don't apply, so it either doesn't exist at all or is only available as a very niche specialized unit (kind of like opto-isolated USB hubs, which go for $200).
You could also theoretically use 10Gbit Ethernet to transfer data from your phone but we don't go around putting SFP slots on phones because we don't want every single cable or charger to cost $100 and be more complex (thus more prone to failure) than needed.
The thing is, you're not replacing it with something identical. If you want to replicate the behavior of a previous-generation USB 2 hub, I don't think it would even be that expensive. You just need a port controller on each port to negotiate 5V, 1.5A, in addition to the standard USB hub silicon. It would be a bit more expensive, but not terribly so. USB 3 may be similar but require some extra switches...
The 'really expensive' $400 one being asked for has vastly more options and capabilities than a traditional USB hub. It's basically a Thunderbolt hub with even more complexity, so it's not unreasonable to think those would be damn expensive.
I'm honestly not sure why no one has made a standard USB 2.0 Type-C hub. There's really nothing stopping you from doing it. I guess maybe because it could be confusing for a consumer? I agree, that's a huge issue with Type-C right now. It's very unclear by looking at the port (or the cable) what it's actually capable of. If you're lucky there will be a tiny symbol indicating some subset of functionality, but oftentimes there isn't.
Oh that's interesting. That might be an extra reason why devices are still preferentially coming with USB3 type A plugs on the other end - it's actually more flexible that way? I am in fact plugging both of my new USB-C devices into a USB3 hub, and had no idea I wouldn't have been able to do the same with USB-C cables and hub.
(I assume you can still have a USB3 hub with type A sockets on it, plugged in to a USB-C port? - edit: oh, just noticed a sibling comment says this doesn't seem to exist either. Now that is very odd.)
USB 3 hubs with USB-C upstream certainly exist - all the docks out there are functionally that, and there are dedicated hubs as well.
Here’s the one I use:
Anker USB C Hub, Aluminum USB C Adapter with 4 USB 3.0 Ports, for MacBook Pro 2018/2017, ChromeBook, XPS, Galaxy S9/S8, and More https://www.amazon.com/dp/B07DFYQXY7
It's definitely possible, it's just a bit expensive. There's nothing in the USB-C spec that disallows you from doing this. Also if you're trying to do the thing where you have charging input but data output on one port, it gets complicated.
That said, I see no reason why you couldn't design a basic USB-C USB2 or USB3 hub. Hell, it could have 5 ports with no defined 'input' port (just 5 USB C jacks), and you could have that work properly with some pain.
If it has a captive USB-C cable and 4 USB-C outputs, that would be even easier to program. Again, it's just expensive. You need a lot of active circuitry to make it work.
Yeah, all my audio equipment (keyboards, controllers, interface) is USB-A and I don't see producers switching en masse (and even if they do, do I really want to throw away my favorite keytar?). The only exception is the Roli Block, but Roli is kinda meh - fun to play every once in a while, but not something I'd like to use every day to lay down bass tracks or piano chords. I've disabled a ton of various background stuff like Siri, but this setup is still flakey. If I were trying to make money on my music I'd be really pissed right now.
I think all my musical equipment has a USB-B port on it and works over USB 2.0, and I've got like a dozen USB-A-to-USB-B cables around. Those plug into every other computing device in my house except new Macbooks (and iPads, but with a Lightning to USB-A adapter, so still USB-A, and notably I can't share that adapter with a Macbook because Apple's approach to this has been a bizarre mix of foot-dragging and waffling on the iDevice side while running way ahead of the pack on the other). My enthusiasm for buying new cables or adapters only for fairly new Macbooks, when any other new computing device I buy would still have USB-A ports, and then having them maybe not work anyway because USB-C compatibility and cable/adapter quality is a shit-show, is exactly zero.
And I'm not made of money and I have other hobbies that require cash so I buy used musical equipment so I can get low-end pro gear a generation or two old for at or under new prices for amateur/crappy gear. There's little chance I'll buy any USB-C native musical equipment of any kind before 2030 or so. Who's cycling out all their gear—musical or otherwise—so fast that dropping USB-A and HDMI entirely in 2016, on a "pro" laptop, made any damn sense?
You do know that all you have to do is buy a USB-B to USB-C cable and you can connect your existing equipment to your Macbook, right? I mean, it sucks that the new Macbooks have so few USB-C ports, but there's nothing preventing you from plugging all of the older USB-A/B/micro/etc. equipment directly into it, as long as you buy the correct $5 cable with a USB-C connector on the other end. No dongles necessary.
As a former long-time Ableton user, I just want to recommend Bitwig Studio[1] as a viable (improved?) alternative that runs natively on Linux. I've used it for live performances and label-released music for over four years now, and other than the typical setup issues related to Jack (a one-time thing resolved in an hour or so) I couldn't be happier.
>>other than the typical setup issues related to Jack (one time thing resolved in an hour or so)
I was never able to get Jack up and running on my MBP. I tried both Jack and Jack2, compiling from source, using their prepackaged binaries, using the CLI vs. the GUI, etc. ... nothing doing. Do you have any resources you can share detailing how you resolved the issue or how you configured it? Feel free to PM, email's in my profile.
I'm really tempted here. I've got a paid Ableton license, and I absolutely love Ableton, but it's the main thing keeping me from dumping Windows 10 for Linux.
Transitioning between Ableton and Bitwig is seamless from a UI standpoint. They have essentially the same controls.
The only thing that isn't great moving from Windows to Linux for production is your VST ecosystem becomes much more limited. Running plugins under WINE is alright, but if they're heavy you'll run into latency issues. There's a sizable collection of plugins that are native under Linux though[0]. I've used many of these with great success.
You can get pretty far with the built in instruments and effects in Bitwig, especially with the new modular grid. I haven't felt much need for third party stuff.
I make music on a 2018 iPad Pro and now I can never upgrade that because they've removed the headphone jack on the new ones. What's "Pro" about an iPad without a headphone jack?
* this is a totally legitimate product decision because no one likes wires on their headphones since like 2016
* what you really want is bluetooth, especially airpods -- have you bought your airpods yet? Ecosystem FTW!!1! -- even though they introduce latency that's unacceptable in some situations
* Apple knows best, and look, you don't get a career as a bold product thinker by caring about some kind of insignificant minority of niche creators who care about latency in music making
* Besides, why are you such a whiner? Just buy a $35 dongle that takes up your only port like everyone else (at least, until we finally succeed in our quest to remove that port too and achieve seamless invulnerable portless design, a rock, an island, and an island never cries)
Shit, seriously? We use a first-gen big pro on our (electric) piano at home and being able to run midi into the iPad while also using headphone-out on the iPad is key for our use. I guess at least the new ones use USB-C so there are more adapter options so maybe there's a USB + headphone jack adapter out there, but 1) it's entirely stupid that we'd need one, and 2) if it's not a first-party Apple adapter we'd be rolling the dice that it works at all, let alone that it'll still work in six months—which is a big part of the problem with this dumbassed dongle-verse they've created, that most of the dongles on the market are awful so you can't always count on finding a version of the thing you need that actually works with your hardware.
Or I guess they expect me to buy Bluetooth headphones, but... why? We have like three sets of corded ones, two of which are quite good, and about a dozen pairs of ear buds. We don't need new headphones. We need a headphone jack.
> Or I guess they expect me to buy Bluetooth headphones
...which adds another source of frustration to the setup - Bluetooth latency. At some point I tried a setup like this: MIDI keyboard -> USB dongle -> iPad -> Bluetooth speaker; the latency sucked. It's not something you notice when you listen to music on Youtube, so it's not something 99% of users will notice, which means it will never get fixed.
Finding a USB-C adapter with a good DAC may be a process in itself. I got several $20-$40 adapters from China, and they all have a very tinny, absolutely atrocious sound from their headphone jack.
I play at a local church sometimes, and they use Ableton for their click tracks and loops.
They've had SO many issues with it. It has randomly died mid-performance, the tempo click has completely gone off the rails mid-performance, and the loops got transposed once into a completely different key mid-performance.
I use a 2012 MBP stuck on High Sierra. It has occasional issues when I record, but nowhere near the number of issues we experience at the church. I want to upgrade badly, but I really don't want to deal with those kinds of issues.
Experiencing audio glitches all the time on my Mac. Never had any issues at all before installing Catalina.
After a recent Catalina update all the streaming video / audio stutters stopped only to be replaced with random dots appearing over my display - almost like my graphics card was dying. That’s now disappeared (after the most recent update) but the stuttering has restarted.
Running Console.app as these issues occurred showed a number of errors that came up time and again: Bluetooth seemingly interrupting the audio, graphics drivers failing, disk access halting the system. Catalina is a poor product - probably the version of macOS where I’ve experienced the most issues.
I've been getting playback glitches in Logic* with a 2020 MBP 16" running (as shipped) with Catalina.
It seems to only happen when using the built-in speakers. I never hear glitches using headphones (which is what I have as the default audio interface for Logic).
So I just use headphones all the time, but it's an incredibly negative experience to pop open one of your projects on your new laptop and hear this rhythmic popping. If it had also glitched in the headphones I might literally have thrown the laptop out the window.
* This isn't my post, but it's exactly the glitch I get in the built-in speakers: https://imgur.com/a/YPCmyiV
Catalina is such a tire fire. (Although, the Touch Bar helpfully suggested the emoji when I typed that.) It crashed multiple times a day hooked up to my external monitor. I was on the verge of returning it until 10.15.5 came out and seems to have fixed it (though it hasn't for everyone suffering from GPU-related kernel panics).
I've seen reports of pops/crackles in digital audio mixing/synthesis applications being fixed by disabling hyperthreading. This makes some sense if hyperthreading can sometimes add extra latency before some threads get real CPU work done, which is hard for the OS to strictly control. There have been a few different ways to disable hyperthreading on Macbooks over the years, but Apple now offers a supported way (intended to mitigate recent Intel CPU vulnerability corner cases for the paranoid): https://support.apple.com/en-gb/HT210108
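If you want to verify the toggle actually took effect, comparing physical vs. logical core counts is a quick sanity check. A small sketch using the standard macOS `sysctl` keys:

```python
# Quick check of whether hyperthreading/SMT is currently active on a Mac:
# if the logical core count exceeds the physical core count, SMT is on.
import subprocess

def sysctl_int(key: str) -> int:
    out = subprocess.run(["sysctl", "-n", key], capture_output=True, text=True, check=True)
    return int(out.stdout.strip())

physical = sysctl_int("hw.physicalcpu")
logical = sysctl_int("hw.logicalcpu")

print(f"physical cores: {physical}, logical cores: {logical}")
print("hyperthreading appears", "ENABLED" if logical > physical else "disabled")
```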
> This makes some sense if hyperthreading can sometimes cause extra latency for some threads to get real CPU work done, which is hard for the OS to strictly control.
Threads requiring hard real-time responsiveness can (and do) request to be marked as such by the scheduler. And the OS is certainly aware of what hyperthreads-presenting-as-cores belong to which physical cores, and uses this information in scheduling. So I'd find it implausible that any modern OS (esp. one that already had good DAW support) doesn't already have logic to try to keep the other hyperthreads of a core on which a hard-realtime thread is running, as uncontended as possible, by either not scheduling, or very conservatively scheduling (i.e. "early" pre-empting) anything on that hyperthread.
I have a feeling it's something much more arcane. Maybe a hitch generated by the core switching "licenses" due to an AVX512 instruction.
That's weird, reliable low latency audio is one of the few things where I still find the Mac to be pretty solid.
I've got a 2015 15" and I've used a 2018 13" several times, on both Mojave and Catalina, and I have to say it outperforms my 2015 (it can easily manage a 32-sample buffer with some plugins loaded, while my 15" starts to struggle under 64).
This was using Reaper with an Apple USB-A to USB-C adapter and a UMC204HD interface.
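For context on what those buffer numbers mean in practice, latency scales directly with buffer size. A rough back-of-the-envelope calculation (per-buffer only; real round-trip latency is higher once you add input and output buffers plus converter/driver overhead):

```python
# Rough buffer-size -> latency conversion at a fixed sample rate.
SAMPLE_RATE = 44_100  # Hz

for buffer_size in (32, 64, 128, 256):
    latency_ms = buffer_size / SAMPLE_RATE * 1000
    print(f"{buffer_size:>4} samples -> {latency_ms:5.2f} ms per buffer")

# 32 samples is ~0.73 ms, 256 samples is ~5.8 ms: a machine that can't keep up
# forces you into bigger buffers and noticeably laggier monitoring.
```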
I use Ableton with a Focusrite Scarlett 2nd gen. I'm currently using a 256 buffer. I tried 128, but it screwed up some vocal takes I did the other day (there are 5-10 seconds of crackles in the middle of each 4-minute-long complete song take). I also froze the other tracks (eight of them, so not that many) before starting to record. No FX on the master track.
Maybe I should've bought a 2015 MBP then. Or maybe I should downgrade to Mojave? Or buy the Apple adapter? Or a different interface? No idea; both Focusrite and Ableton say they're compatible with Catalina.
As I've said, the 13" 2018 I use from time to time can easily manage a 32 buffer, even on Catalina, so your machine should be just as good.
My suggestion is to try with a different adapter and maybe with a different DAW such as Reaper, just to double-check (although I don't really think the DAW is the issue here).
It may also be related to the plugins you're using; maybe they're just heavier than mine.
Realistically, how long can one expect to keep using a Retina Macbook Pro? I've had my mid-2015 since... mid-2015, and now I'm getting paranoid it'll up and explode on me out of the blue.
I have a new OWC SSD upgrade, so that should be alright. I'm trying to keep it running cool by avoiding heavy software, but the battery is at like 500 charge cycles and 70% capacity. Man, it feels so scary getting old.
Can you start playing audio, go to System Preferences -> Date and Time -> Unlock the panel -> repeatedly uncheck and re-check "Set date and time automatically" and see if that causes audio glitches? There's a known problem with that series of MBPs, I believe with the T2 chip. I'm still on the 2015 Macbook so I can't test it, but another audio pro I know says some recent Catalina update fixed it, and before that they did their live shows with it unchecked and didn't have a problem.
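If clicking through System Preferences repeatedly gets old, the same setting can be flipped from a script while audio plays. A rough sketch (it shells out to `systemsetup`, which needs to run as root, and it assumes the glitch really is triggered by the network-time toggle):

```python
# Toggle "Set date and time automatically" a few times while audio is playing,
# to see whether the network-time sync correlates with audio glitches.
# macOS only; run with sudo because systemsetup requires root.
import subprocess
import time

def set_network_time(enabled: bool) -> None:
    state = "on" if enabled else "off"
    subprocess.run(["systemsetup", "-setusingnetworktime", state], check=True)

for _ in range(5):
    set_network_time(False)
    time.sleep(5)  # listen for pops/crackles here
    set_network_time(True)
    time.sleep(5)
```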
Thanks for the update - typed from my 2013 Air with a 32-bit-capable OS (10.13). Think I'll hold on.
One fun benefit of 2013 tech is you can buy a bunch and give them to your friends for the same price as a 2020 model. I've now bought 4 and given away 3. It's good that they're reliable - many other laptops don't really last 7 years.
Could this all be dongle related? So much badness seems to get introduced by dongles, and I have spent so much time debugging crazy issues that turned out to be due to USB-C dongles. Mine have almost all been monitor related, but at this stage there is almost nothing that I would be surprised to see fixed by switching or removing a dongle.
By dongles do you mean just hubs, or adapters too? Unfortunately, I also want to record vocals and external instruments, for which I need a MIDI interface (which uses USB-A); and I don't think you can buy a USB-C MIDI keyboard yet (actually the Roli Block has USB-C, but I don't like it much and 95% of the time I'd rather use a normal keyboard). Is there a particular dongle you can recommend? (By the way, my Focusrite interface does not work with the "quality" Aukey hub that I bought; I'm using some no-name dongle for it.)
You don't need a dongle to use regular old USB-A devices with a USB-C computer. All you need to do is replace the cable, which is usually like $5. So if your MIDI keyboard or audio interface has a USB-B connector on the hardware side (like the Scarletts do), just buy a USB-B to USB-C cable. I don't know why the myth of needing a USB-C-to-A dongle has persisted for so long.
I think I am lumping them all together, but my bad experiences are all with adapters (e.g. USB-C to HDMI and similar). At least half the problem for me is the USB-C cable and the variety of things that can be plugged into the port that look the same but are not.
I have been scarred by a target-disk-mode recovery which required a new, non-Apple cable to work. It’s crazy that we have ended up here.
Audio is not something I know well, but I have yet to have problems with Belkin adaptors.
https://www.google.com/amp/s/www.macworld.com/article/291181...
Is it possible that a memory upgrade might fix the issue, assuming each new OS consumes more base memory and that the stutters are the side effect of VM pageouts or pageins to or from disk?
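One way to test that theory before spending money on RAM: watch the pager while the glitches happen. A quick diagnostic sketch that parses `vm_stat` output (present on stock macOS); it's only a check, not a fix:

```python
# Sample macOS vm_stat periodically; if "Pageouts" climbs while audio is
# glitching, memory pressure is a plausible culprit.
import re
import subprocess
import time

def pageouts() -> int:
    out = subprocess.run(["vm_stat"], capture_output=True, text=True, check=True).stdout
    match = re.search(r"Pageouts:\s+(\d+)", out)
    return int(match.group(1)) if match else 0

baseline = pageouts()
for _ in range(6):
    time.sleep(10)
    print(f"pageouts since start: {pageouts() - baseline}")
```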
I record via an interface (Focusrite Scarlett 2nd gen) and have pop issues with it too; when I used it with my old laptop it worked much, much better (on the same settings).