Hacker News | austinpena's comments

are you caching the HTML response or just the assets?


I can't remember; I think it's just the assets, though. It seems to be surviving so far. How do you get it to cache everything? Is that a paid thing?


Page rules should do it


Page rules did it!! thank you
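For anyone who wants to script it: a "Cache Everything" Page Rule can also be created through Cloudflare's API rather than the dashboard. Below is a hedged sketch in Go of building the request body; the zone ID, URL pattern, and auth are omitted, and the exact field names should be double-checked against Cloudflare's Page Rules API docs.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Sketch of the Page Rule payload that tells Cloudflare to cache HTML
// responses too ("Cache Everything"), not just static assets. The URL
// pattern is a placeholder; verify field names against Cloudflare docs.
type constraint struct {
	Operator string `json:"operator"`
	Value    string `json:"value"`
}
type target struct {
	Target     string     `json:"target"`
	Constraint constraint `json:"constraint"`
}
type action struct {
	ID    string `json:"id"`
	Value string `json:"value"`
}
type pageRule struct {
	Targets []target `json:"targets"`
	Actions []action `json:"actions"`
	Status  string   `json:"status"`
}

func cacheEverythingRule(pattern string) pageRule {
	return pageRule{
		Targets: []target{{
			Target:     "url",
			Constraint: constraint{Operator: "matches", Value: pattern},
		}},
		Actions: []action{{ID: "cache_level", Value: "cache_everything"}},
		Status:  "active",
	}
}

func main() {
	body, _ := json.Marshal(cacheEverythingRule("example.com/*"))
	fmt.Println(string(body))
	// POST this body to /zones/{zone_id}/pagerules with your API token.
}
```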


Unlikely, but they should check their invalid click rate to be sure.


Invalid click rate is not always a reliable metric.

I've been dealing a lot with click fraud on Google Ads, and it's usually hard to detect it without special tools.


What’s your experience?

Off-the-shelf click fraud software (for search) has never been ROI-positive for me when I run it in A/B tests.

Fou Analytics is a fun tool for social/native etc., though.


The click farms we had been dealing with for our clients did not render images on webpages during visits, so we put tirreno on the backend of the platform and added a 1px image that sends the same request to tirreno to spot the difference.

Page loaded + image not loaded = blacklist via API.


A few things to determine if what you're experiencing is actually Google "being dead":

1. Check your search volume. Use Google Trends or the method I will share below.

2. Check how you spent in December vs. how you spent during a previously great time. Understand if it's a volume issue or a conversion issue.

3. See if anyone new entered your auction. If they did, find out what they're saying.

-- 1a) Search Volume

Checking search volume: In the era of broad match, this is one of the most underrated approaches to diagnosing issues. Take a look at your `search exact match impression share` relative to your impressions on a few of your top keywords. Then work out whether search volume for your business is actually decreasing. Then use the following rubric to diagnose further:

1. Not decreasing: move on to the next item.

2. 5-10% decrease and a competitive auction: if you have a decrease AND a competitive auction, a 20% drop in efficiency could be explained.

3. 5-10% decrease and a not-so-competitive auction: if this is the case, the drop in volume may not be what's causing your issues.
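A rough sketch of the 1a arithmetic: eligible searches are roughly impressions divided by impression share, so you can back out whether demand itself shrank, then bucket the year-over-year change. Function names and the exact bucket boundaries are my own; only the 5-10% band comes from the rubric above.

```go
package main

import "fmt"

// Eligible searches ~= impressions / impression share, so flat
// impressions with a rising share still means shrinking demand.
func estimatedEligibleSearches(impressions, impressionShare float64) float64 {
	return impressions / impressionShare
}

// diagnose buckets the YoY decrease per the rubric; boundaries below
// the 5% band are my own guesses.
func diagnose(decrease float64, competitiveAuction bool) string {
	switch {
	case decrease <= 0:
		return "not decreasing: move on to the next item"
	case decrease >= 0.05 && competitiveAuction:
		return "decrease + competitive auction: could explain a ~20% efficiency drop"
	case decrease >= 0.05:
		return "decrease but soft auction: volume is probably not the issue"
	default:
		return "small decrease: keep digging elsewhere"
	}
}

func main() {
	lastYear := estimatedEligibleSearches(10000, 0.50) // 20,000 eligible searches
	thisYear := estimatedEligibleSearches(9500, 0.52)
	decrease := 1 - thisYear/lastYear
	fmt.Printf("estimated volume decrease: %.1f%%\n", decrease*100)
	fmt.Println(diagnose(decrease, true))
}
```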

-- 1b) Click volume

Check your exact-match impression-to-click rate. As with the last approach, this helps diagnose whether SERP feature changes are reducing the number of clicks you receive despite demand remaining flat.

If this is the case, take a look at the SERP and find the new winners.

-- 2) Segment comparison

Compare December year-over-year and see what changed. Are you serving to a different age range? A different search term mix? Increased spend on search partners? Are different headline combinations serving?

-- 3) Auction changes

Have you checked your auction insights? Are new competitors being more or less aggressive? If so, what are their headlines? Are they offering an easier booking experience than you are?

And... if Google is actually dead, you might try:

1. Meta ads. Turn off Audience Network, make sure you've got the Conversions API set up, and see what happens. Expect leads to be lower intent. Make your creative dead simple: "If you're looking for kid party entertainment in Northdene..." Start with $20/day optimizing for leads.

2. Improve your form. I see Typeform-style forms do better than the long one you have.

3. (Maybe) If you don't already track `closed (won)` conversions in your Google Ads account, that could help. I find that when I start tracking which searches turn into deals, I can restructure my account to de-prioritize the junk leads.

4. (Maybe) Add a soft form to each of your service pages. Basically an embedded form which starts by asking people softball questions like "How Old Are The Kids At Your Party." Once people start a form they're much more likely to complete it, even if the questions are very basic.

5. (Maybe) Add a way to give you a phone call. Phone call leads convert 30-50% better in my experience. But this isn't an option for every business.


Thanks for the info. Will definitely be doing a post mortem after I finish scrambling for new work opportunities


Or like Dave App or Chime.

Fun fact about Dave App's tipping: if you brought the tip down to zero, you saw an animation of a kid's food being taken away from them.

https://www.ftc.gov/news-events/news/press-releases/2024/11/...


Wow, that FTC case gets worse and worse the farther into it you read. What a scumbag company. It's like they opened the "Dark Patterns Unleashed" book and followed every example.


I have an SSR Astro project. Using Fly makes my project fast.

For dynamic data I use SWR.

I could use Cloudflare Workers, but they don't play so nicely with Astro.

I also have a “form submission service” where I receive a Post and send an email.

I need maximum uptime to avoid revenue loss.

It's a Go service, so I deploy ~6 machines across the US to ensure I don't drop any requests.

I haven’t had downtime in years.


Reminds me of https://www.lucidlink.com/ for video editors. I quite like the experience with them.


That's exactly right; I've spoken with a ton of folks who have had a good experience with LucidLink. I think we're in a slightly different part of the market (we aren't targeting video editors, but rather data-intensive applications that may use thousands of IOPS), though I appreciate that the technology is likely similar.


M4 Mac Mini with 16GB RAM is doing a "good enough" job of editing 6k raw footage in Premiere for my team. I'm surprised to say I'm content with the 16GB of ram so far.

Edit: This is in contrast to my M1 Macbook Air with 16GB of ram which would stutter a lot during color grading. So definitely feeling the improvement.


MacBook Airs thermal throttle; that's why the Air with 16GB is an issue and the Mini isn't. No fans means throttling under heavy loads. It's not a RAM issue.


I bought the first MacBook Air M1 with 8GB because it was the only option available in my area. Initially, I had doubts, especially after using notebooks with more than 16GB of RAM in previous years. But I was genuinely surprised by how well the M1 performed. My takeaway is that there’s a lot of room for similar improvements in Linux!


I have an m1 as well.

And while I'm broadly satisfied with its performance, I do think that the SSD is probably carrying some of that load. And for a machine that often gets used far longer than a PC, I can't see that being great for longevity.


> And while I'm broadly satisfied with its performance, I do think that the SSD is probably carrying some of that load. And for a machine that often gets used far longer than a PC, I can't see that being great for longevity.

This isn't the early 2010s anymore - SSDs last "long enough" for most people, to the point that they are no more of a consumable than your motherboard or your RAM. (I've actually experienced more RAM failures than SSD failures, but that's anecdotal.)

And for the downvoters - do you remember the last time you handed in your Steam Deck, Nintendo Switch, iPhone, or even laptop specifically for a random SSD failure, unrelated to water damage or another external cause? Me neither.


People really don't grasp that the slowest SSDs are still 3-5x faster than the fastest HDDs (including SAS drives, yes, the dual-port kind).

And looking at the AnandTech review of a Vertex 3 way back in 2011...


I'm still very happy with my 8GB Air M1 as well. It's incredible how well it still works for a 4 year old entry level laptop. I see all these new M's come out, and I'm sure they're fantastic, but I'm not at all tempted to upgrade.


Yeah, I don’t know why 8gb base models get so much hate online. 8gb is 64 billion bits of memory. If you’re writing everyday software and you need more memory than that, you’re almost certainly doing something wrong.


Seriously?

What if you want to have a few browser tabs and a spreadsheet open? Or containers?

My M1 routinely rests around 22gb of RAM.


I also use an 8GB M1. It has firefox with many tabs & windows open in OSX and also a Linux VM in UTM which is running VSCode, vite, and another firefox with lots of tabs. It's performing well! (although swap is currently at 2.3GB, and there's a further 3.5GB of compressed data in RAM)


How much RAM should a few browser tabs and a spreadsheet use? Spreadsheets and webpages were both invented at a time when computers had orders of magnitude less ram than they do today. And yet, Excel and Netscape navigator still worked fine. It seems to me that bigger computers have caused chrome to use more memory.

If 16gb is considered to be a "bare minimum" for RAM, well, how much ram will all those programs use next year? Or in 10 years?

That doesn't help you right now, but 22gb is ridiculous for a few browser tabs and a spreadsheet.


> If 16gb is considered to be a "bare minimum" for RAM, well, how much ram will all those programs use next year? Or in 10 years?

16gb is the figure for the next 10 years. If you see yourself being content with 8gb of memory shared between your CPU and GPU in 2030, you must have a uniquely passive use-case.

I remember when people said 4gb doesn't need to be the minimum for all Macbooks. Eventually MacOS started consuming 4gb of memory with nothing open. Give Apple a few years to be insecure about the whole AI thing and they'll prove to you why they bumped the minimum spec. Trust me.


I'm not doubting that modern macos, chrome, firefox, spotify, etc are giant memory hogs.

My claim is that the developers should fix their shit, and stop making their laziness something for me to solve by buying more RAM.


It’s not just for tabs and spreadsheets, I also have an ide, containers, etc.

I do think the memory footprint of many applications has gotten out of hand, but I am more than willing to spend the extra money not to have to think about it.


This doesn't necessarily mean that your workload would perform unacceptably on an 8GB model. It just means that fewer optional things would be cached in RAM, more RAM pages would be compressed, and there'd be more swap usage.


Using Chrome I'd guess?



Mine are fine.


I'm very grateful to this post for introducing me to sliceutils to create a map from a slice. I think that's a very elegant way to create nested models given a parent and child struct.
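The post's actual sliceutils helper isn't quoted here, so below is my own sketch of the pattern with invented types: a generic GroupBy that indexes a child slice by parent ID, so nested models can be assembled in a single pass.

```go
package main

import "fmt"

// Invented parent/child structs to illustrate the slice-to-map idea.
type Author struct {
	ID   int
	Name string
}

type Book struct {
	AuthorID int
	Title    string
}

// GroupBy builds a map from a slice using a key function.
func GroupBy[T any, K comparable](items []T, key func(T) K) map[K][]T {
	out := make(map[K][]T, len(items))
	for _, it := range items {
		out[key(it)] = append(out[key(it)], it)
	}
	return out
}

func main() {
	authors := []Author{{ID: 1, Name: "Ann"}, {ID: 2, Name: "Bo"}}
	books := []Book{
		{AuthorID: 1, Title: "First"},
		{AuthorID: 1, Title: "Second"},
		{AuthorID: 2, Title: "Third"},
	}
	// One pass over books, then O(1) lookup per parent.
	byAuthor := GroupBy(books, func(b Book) int { return b.AuthorID })
	for _, a := range authors {
		fmt.Println(a.Name, len(byAuthor[a.ID]))
	}
}
```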

