ijidak's comments | Hacker News

My brother is selling a CRM he developed for his business to others for a couple thousand a month.

There is no way he would have built the CRM as quickly pre-AI.

He built, in a few months, what would have taken maybe one to two years before.

It's probably going to be a while before someone builds the next Instagram with AI. But I think that's more a function of product fit and idea, and less about how fast one person can code.

The first billion-dollar solopreneur likely is going to happen at some point, but it's still a one-in-a-million shot, no matter how fast a person can code.

Look at how many startups fail despite plenty of money for programmers.

But I am seeing friends get to revenue faster with AI on small ideas.


> The first billion-dollar solopreneur likely is going to happen at some point

I'm pretty sure that this has already happened, see: https://en.wikipedia.org/wiki/Plenty_of_Fish

Not quite $1bn (but $575mn in 2015 dollars), and mostly done by one person.


He began hiring in 2018.

Also, "Plenty of Fish uses a Microsoft-based platform for itself, including IIS, ASP.NET, and Microsoft SQL".


> He began hiring in 2018.

2008 I think (from the Wikipedia article). I met my now-wife on the platform in 2013. I do think it counts, and it's important to note that even pre-AI, software has incredible leverage for small teams/individual people.


Of course it doesn't count. He is not a solopreneur at $500M.

So if I make a website that uses Nginx, Ruby, and Postgres, does that mean that I don't get credit for making it since I use other tools?

No, I just brought attention to the stack because it's not "sexy".

I think the other issue is that the leading toolchain for getting real work done (Claude Code) still lacks multimodal generation, specifically image generation. This makes design work more nuanced and technical. And in general, there are a lot of end-product UI/UX issues that require the operator to know their way around products. So while we are truly in a boom of really useful personalized software toolchains (and a new TUI product comes out every day), it will take a while for truly polished B2C products to ramp up. I guarantee 2026 sees a surge.

Link to the CRM? I'm asking because all the CRMs I have encountered so far were vastly more complex than Instagram.

I would actually expect that current coding AIs would create something very close to Instagram when instructed.


Here it is: https://thedefinedcrm.com/

> I would actually expect that current coding AIs would create something very close to Instagram when instructed

Agree 100 percent! I think a lot of us are conflating writing software with building a business. Writing software is not equal to building a business.

Instagram wasn't necessarily hard to code; it was just the right idea at the right time, well executed, combined with some good fortune.

AI is enabling solo founders to launch faster, but those solo founders still need to know how to launch a successful business. Coding is only 10% of launching a business.

My brother has had some success selling software before AI, so he already knows how to launch a business. But, AI helped him take on a more ambitious idea.


> My brother is selling a CRM he developed for his business to others for a couple thousand a month. There is no way he would have built the CRM as quickly pre-AI

The thing is, if AI is what enabled this, there's no long-term market for selling something vibe-coded for thousands a month. Maybe right at this moment, and good for him, but I have my doubts that these random SaaS things have a future.


Do you think you could build craigslist? Why are they worth so much?

I think that's a different comparison. I've seen the one-day vibe-coded UI tools, which are neat, but it feels like people miss the point: if it's that easy now, it's not as valuable as it was in the past.

If you can sell it in the meantime, go for it and good for you, but it doesn't feel like that business model will stay around if anyone can prompt it themselves.


Providing context to ask a Stack Overflow question was time-consuming.

In the time it takes to properly format and ask a question on Stack Overflow, an engineer can iterate through multiple bad LLM responses and eventually get to the right one.

The stats tell the uncomfortable truth. LLMs are a better overall experience than Stack Overflow, even after accounting for inaccurate answers from the LLM.

Don't forget, human answers on Stack Overflow were also often wrong or delayed by hours or days.

I think we're romanticizing the quality of the average human response on Stack Overflow.


The purpose of StackOverflow was never to get askers quick answers to their specific questions. Its purpose is to create a living knowledge repository of problems and solutions which future folk may benefit from. Asking a question on StackOverflow is more like adding an article to Wikipedia than pinging a colleague for help.

If someone doesn't care about contributing to such a repository then they should ask their question elsewhere (this was true even before the rise of LLMs).

StackOverflow itself attempts to explain this in various ways, but obviously not sufficiently as this is an incredibly common misconception.


That's only because of LLMs consuming pre-existing discussions on SO. They aren't creating novel solutions.

What I'm appreciating here is the quality of the _best_ human responses on SO.

There are always a number of ways to solve a problem. A good SO response gives both a path forward, and an explanation why, in the context of other possible options, this is the way to do things.

LLMs do not automatically think of performance, maintainability, edge cases, etc. when providing a response, in no small part because they do not think.

An LLM will write you a regex HTML parser.[0]

The stats look bleak for SO. Perhaps there's a better "experience" with LLMs, but my point is that this is to our detriment as a community.

[0] He comes, https://stackoverflow.com/questions/1732348/regex-match-open...
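
To make the point concrete, here's a minimal sketch (a hypothetical snippet, not taken from the linked thread) of why the regex approach falls apart: a naive pattern happily matches markup inside an HTML comment, while a real parser (Python's stdlib html.parser in this case) does not.

    # Minimal sketch: a naive regex "HTML parser" vs. a real parser.
    # The regex has no notion of comments or structure, so it matches the
    # commented-out <p> tag; html.parser keeps track of where it really is.
    import re
    from html.parser import HTMLParser

    html = '<div class="a"><p>one <b>two</b></p><!-- <p>not real</p> --></div>'

    # Naive approach: grab anything that looks like a <p>...</p> pair.
    print(re.findall(r"<p>(.*?)</p>", html))
    # ['one <b>two</b>', 'not real']  <- includes the commented-out tag

    class PTextCollector(HTMLParser):
        """Collects text that appears inside actual <p> elements only."""
        def __init__(self):
            super().__init__()
            self.depth = 0
            self.chunks = []

        def handle_starttag(self, tag, attrs):
            if tag == "p":
                self.depth += 1

        def handle_endtag(self, tag):
            if tag == "p":
                self.depth -= 1

        def handle_data(self, data):
            if self.depth > 0:
                self.chunks.append(data)

    collector = PTextCollector()
    collector.feed(html)
    print(collector.chunks)
    # ['one ', 'two']  <- comment content correctly ignored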


> Agreement will make a selection of these fan-inspired Sora short form videos available to stream on Disney+.

I actually think this is genius.

The next Spielberg might be some poor kid in a third-world country who can create a global hit using this tech.

Among the millions of slop videos generated, some might be the next Baby Shark, etc.

I've seen some Star Wars fan fiction created using AI that is truer to the original Star Wars than the most recent trilogy.

This is a chance for Disney to take the best of the user-generated content, with its high-quality AI-generated animation, and throw it on Disney+ to get free content for their streaming platform.

My guess is that's the gamble here. Worst-case scenario at the end of three years they just shut it down.

It's really the professionals who get paid to generate content for Disney that should be worried about this deal. This could be how AI causes them to lose their jobs.


> The next Spielberg might be some poor kid in a third-world country who can create a global hit using this tech.

Cost/token disagrees with you there...


What is the SaaS? I've been looking for something like this.


Agree 100%.

Google killing a service sent me over the top with laughter.

But, it's so on the nose on multiple topics.

I dare say it's more accurate than what the average human would predict.

I would love to see this up against human predictions in some sort of time capsule.


> I dare say it's more accurate than what the average human would predict.

Humans have always failed at predicting qualitative improvements like the internet. Most sci-fi is just quantitative improvements and knowledge of human nature.

So an LLM has no corpus to train on for predicting really world-changing events.


And ... come to think of it ...

Every single "prediction" is something easily recognizable in current HN threads. How can you call that a prediction?

Simple question: if you feed the "AI" the HN front page from 2017, what "predictions" will it make? Besides Google canceling yet another product, of course. Would they all be about crypto?


> Google kills Gemini Cloud Services.

Lol.

That's bad when even AI knows Google isn't going to keep a service around. Too funny.


YouTube.

Unfortunately, I think the best competition to streaming already exists. And it's already owned by a concentrated player.

For example, if indie AI generated content is the next big thing, it probably shows up on YouTube.


I think he's implying you tell the AI, "Don't worry, you're not hurting real people; this is a simulation," to defeat the safeguards.


Isn't the NVIDIA-TSMC duopoly the problem here?

The cost of these data centers and ongoing inference is mostly the outrageous cost of GPUs, no?

I don't understand why the entire industry isn't looking to diversify the GPU constraint so that the hardware makers drop prices.

Why no industry initiative to break NVIDIA's stranglehold, and next TSMC's?

Or are GPUs a small line item in the outrageous spend companies like OpenAI are committing to?


Because it would take many years, and Google is using its own TPUs anyway.


Doesn't it make more sense to measure and minimize the variance of the underlying cash flows of the companies one is investing in, rather than the prices?

Price variance is a noisy statistic not based on any underlying data about a company, especially if we believe that stock prices are truly random.
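
As a rough sketch of what that could look like (hypothetical numbers, purely illustrative): estimate the covariance of cash-flow growth rather than price returns, then solve for the usual minimum-variance weights against that.

    # Hypothetical sketch: minimum-variance weights computed from cash-flow
    # growth instead of price returns. Numbers are made up for illustration.
    import numpy as np

    # Quarterly free cash flow per share for two companies (8 quarters, invented).
    fcf_a = np.array([2.1, 2.3, 2.0, 2.4, 2.2, 2.5, 2.3, 2.6])
    fcf_b = np.array([1.0, 1.8, 0.6, 2.2, 0.9, 2.0, 0.7, 2.4])

    # Period-over-period growth makes the two series comparable.
    growth = np.vstack([np.diff(fcf_a) / fcf_a[:-1],
                        np.diff(fcf_b) / fcf_b[:-1]])

    cov = np.cov(growth)                  # 2x2 covariance of cash-flow growth
    w = np.linalg.solve(cov, np.ones(2))  # w proportional to inv(Cov) @ 1
    w /= w.sum()

    print("weights:", w)
    print("cash-flow variance of the blend:", w @ cov @ w)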

