pierreminik's comments | Hacker News

I've had to help a client in a similar position. We got in contact with the domain registrar's phone support; it took over an hour, and we "proved" we were the legitimate owner of the domain with credit card details along with other personal information the registrar knew but was not public. They changed the recovery email of the domain account temporarily so we could log in and get control back.


And the biometric check is still password based, right? You can't use Touch ID until you've entered your password, and then it's only a convenience for a given time.


There's a joke in Greenlandic, closely related to Inuktitut, about polysemy. The joke stems from Greenlandic having very little polysemy when it comes to hunting, nature, practicalities and activities, while European languages are quite a bit more polysemic in nature.

I don't know how well this joke translates into English but it works very well in Danish. Here it goes:

> A Danish police officer gets called on the radio by a Greenlandic hunter who has been in an accident. The hunter reports that his partner has fallen into the water and has been pulled up again, but might have died from the freezing water. The officer tells the hunter to "make sure he is dead". Over the radio the officer hears a rifle shot, and the hunter replies: "There, he's dead now for sure".

The design of Inuktitut and Greenlandic is very in tune with nature, but I agree that it doesn't mean you can absorb its qualities into other languages. That said, it doesn't mean you can't learn anything from them, as the author of the article claims.


Denmark is geographically a relatively small country, and it is not uncommon for children to travel alone across the country on dedicated children's trains[1] during the weekends.

That said, it's really not uncommon for other family members besides grandparents, and even friends of the family, to take care of your children in Denmark.

[1]: (In Danish) https://www.dsb.dk/find-produkter-og-services/dsb-borneguide...


I saw the Joe Rogan clip[1] and I agree that Michael Osterholm's analysis of this seems correct when it comes to the United States.

Denmark however has a completely different structure socially. All private sector employees who can work from home are urged to work from home. All public sector employees who are not working in any matter-of-life-and-death function are forced to stay at home. The public sector employees will still get paid despite not working. Practically this means very, very few health care workers with children need to be home supervising the children.

[1]: https://youtu.be/cZFhjMQrVts


Wait, how would this prevent health care workers with children from needing to be home? Who is going to be watching those kids while the parents are working at a hospital?


There might be some cases where finding childcare is difficult, but most likely the local hospitals already have a plan for the minority of employees who are single parents or where both parents work in healthcare. Usually a partner can help out.


Emergency daycare facilities will be available for those that have no other options.


How about keeping the daycares open for the children of those in healthcare and only for them? Still a big spread reduction, zero healthcare side-effects. It would be very difficult to enforce because so many others would think that they deserve an exception as well...


Most children have two parents. In the few cases where both parents work in health care, they most likely have immediate family and/or friends who are either public workers or private sector workers who can work from home.


The parent that's not working at a hospital.


Does anyone know Yahoo Small Business' support phone number?


I have tried tweeting to @YSmallBizCare and I really hope they have a solution...


Please, tell me this is Adobe trying to mock IE. It has to be, right? I mean, they've always sort of not valued source code, but this is beyond torturing it.

This is so inspirationally horrible someone spent hours remaking it, as it should've been: http://studentweb.infotech.monash.edu/~wlay/FIT1012/muse-dem....


Indeed. The inefficiency here is astounding, especially considering the relatively straightforward page layout involved.

Original: 1496 lines, 77.9kB

Your version: 104 lines, 4.75kB

I'd thought we'd progressed beyond the state of a decade ago where Dreamweaver or what-have-you would build you a cumbersome and baroque html splooge to match whatever you had done in the designer, but I guess we haven't advanced that far. Just goes to show you that front-end devs are still as necessary as ever I suppose.


How about we've progressed to the state where HTML is the bytecode you don't want to see anyway, and designers can use modern tools to manipulate it? If the generated markup works, cross-browser and cross-platform (I don't know to what extent it does, but let's assume so), then what's the problem?

For many purposes, optimizing the HTML nerd out of the process is a much bigger win than a 20k download (don't forget gzip) is a loss.

I know this is going to get me downvotes, but I think the dogmatic "HTML shall be written by hand!!1" attitude all over this thread is just people clinging to the past.


In this case it's a 78k download, of just text. That adds up over enough users. It adds to page load times, it adds to page rendering times, it adds to bandwidth costs.

And then there are the higher level issues. What happens when you get a bug report about how the page is rendered in a specific browser/OS? Do you want to wade through 1500 lines of html or 100 lines? Which do you think will be easier and faster to fix? Which do you think will be easier to inspect for correctness from the start? What happens when you need to figure out why your page is rendering too slowly? Which is easier to analyze, which is easier to speed up? What happens when you want to change the design? What happens when you want to take the design and use it as a UI for a web-app?

Using a tool that generates such crappy HTML may allow an inexperienced person to create a web page with a decent appearance, and it may even save an experienced designer a few minutes upfront. But over the lifecycle of a project it ends up being an enormous drain. If you're an enthusiastic teen putting up a web page for your mother's knitting circle, it's fantastic! But this is not in any sense a truly professional tool.


Quite. It all adds up and it's unnecessary. It's not green. It's wasteful. It's not elegant. But I fear it is the way things are going, because I've seen this before -- I used to write games in lovely, pure, beautiful assembly language, then folks started using C etc because it was easier and, hey, hardware like the Amiga could still run the games fast anyway. And then the hardware got to a point where it was all so complicated only a madman would contemplate assembly language...


I think you are not considering the many websites for which this inefficiency doesn't matter. Sure, if you're writing Google's main web page, every extra byte counts. But very few web sites are even close to that situation.

For example, there is no big difference in functionality if the Adobe web site takes a few extra milliseconds to load. Customers just want to buy a product and be on their way. And Adobe is a big company... This is even more true for their customers, thousands of small websites that just want to publish content as quickly as possible.


We aren't talking milliseconds, we are talking seconds to 10s of seconds. At that point, you are risking bounces, which any business-oriented site should fear, even Uncle Bob's Burger Bar.


You're forgetting that when several thousand people access a site per second, even a 1kb difference means thousands of dollars of bandwidth cost and fractions of a second of download speed.... let alone 70KB of difference which would bankrupt your company and turn all of your users away.

We don't write HTML/JS/CSS by hand because it's fun, that's for sure. We focus on code-reuse and delayed loading plus AJAX (plus gzip, compress, CDN, cache, etc.) because the customer focuses on speed and the CEO focuses on the bottom line.
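Since the bandwidth argument is ultimately arithmetic, here's a back-of-the-envelope sketch. Note that the traffic volume and per-GB price below are made-up illustrative numbers, not anyone's real figures:

```javascript
// Rough monthly bandwidth cost of extra page weight.
// All inputs are illustrative assumptions.
function monthlyBandwidthCostUSD(extraKB, requestsPerSecond, dollarsPerGB) {
  const secondsPerMonth = 30 * 24 * 60 * 60; // ~2.59 million seconds
  const extraGB = (extraKB * requestsPerSecond * secondsPerMonth) / (1024 * 1024);
  return extraGB * dollarsPerGB;
}

// 70 KB of extra markup, 1000 requests/second, $0.10 per GB:
console.log(monthlyBandwidthCostUSD(70, 1000, 0.10)); // → roughly $17,000 a month
```

Hardly bankruptcy for a company Adobe's size, but real money for pure waste.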


This.

And, not to mention, no business owner wants to get locked into a solution; they want their "data" transferable and standardized to at least some extent.

The code this produces is so horrible you've lost all the time spent with it. Muse dies out, and your time spent in it dies with it.


  And, not to mention no business owner wants to get locked
  into a solution, they want their "data" transferable and
  standardized to at least some extent.
That's a weird thing to say. Why do many businesses continue to use things like MS Office, or rather its fabulous file formats, then? Or other vendor-lock-in products? Don't misunderstand me, I'd wish all businesses would think like this, because open standards are infinitely superior to vendor-specific file formats. But unfortunately, most don't.


Maybe - but nobody uses Front Page, and for good reason. The tools need to reach some critical point of utility and dependability before business will make single-vendor investments.

Still, it absolutely defies reason that a business would need to hire someone specifically to edit a bit of text or change a font on a web site. Once a credible product along these lines hits the market, the terms HTML and CSS will disappear from the overwhelming majority of job descriptions, forever.


That's only weird to those who think open standards to be the only viable standard. MS Office is the de facto standard, and your data is not vendor-locked when using it. Please provide a link to a somewhat decent Word/Excel/... alternative which doesn't at least let you read the document and save it in their own format.


  That's only weird to those who think open standards to be
  the only viable standard.
Open standards are the only viable standard if you want adaptable, future-oriented and collaborative software ecosystems as well as likewise markets. You simply cannot guarantee or even create these circumstances with standards that are set by a single corporation (or worse, a trust) - which is only logical because they were designed to do the exact opposite ('defective by design').

  MS Office is the de facto standard
I wasn't arguing that. My point is that this is bad and needs to be replaced.

  your data is not vendor-locked when using it
I think you somewhat misunderstand the term 'vendor-lock'. Sure, you can open Office files with other programs and convert them into open file formats, such as odt.

This is, however, mostly thanks to people reverse engineering Microsoft's original binary file formats, and MS was not really happy about this to begin with. If they could have prevented it, they would have done so (and they tried). Even the newer OOXML is not entirely documented and prevents free implementations due to patents (which, no matter what Microsoft may claim, is the exact opposite of an open standard).

Also, while this conversion might work fine for simple, small documents (or other files), the larger and more complex your files become, the more impossible it becomes to convert without a major hassle, which brings us back to your misunderstanding of 'vendor-lock'. The term doesn't necessarily mean that it's impossible to switch to alternatives; it also applies when measures are taken to make switching as hard as possible without investing heavily in time and money.

As a side note, I am not attacking MS Office specifically. It's just the best example for showing all that is wrong with closed standards and proprietary file formats.


I'm not preaching morals, I'm talking business. Businesses care not for fair software but for their investment, which is why I used the "somewhat open standard" wording.

In the history you're clinging to, you completely ignore that there was no real alternative. Open standards weren't in a viable state. Furthermore, the competition from closed standards has forced open standards to shape up.


Completely off-topic, but, like it or not, business runs on Excel. I certainly don't like it, because I have to deal with it on a day-to-day basis. There's simply no convincing them to move to another format. Excel is the de facto "move numbers and data around" file format in the organization I work for.


If I had to bet, I'd say most companies using a product like this are not looking to build a site that'll be around for years on end. Most of these sites probably have a shelf-life of two years at most.


It's not a matter of clinging to the past.

I definitely want and wish for tools that can allow me to design websites without having to know what IE developers were thinking the day they decided to have some fun.

The reason vim, emacs and the likes are still used a lot isn't because of nostalgia but because it works great for the user.

The issue with all the WYSIWYG editors is that they take the "a web site can be anything and everyone can be a designer" approach, which is completely make-believe. Till they make some form of WYSIWYM editor aimed at the web designer, all other approaches are worthless because they are basically just a fancied-up Word.

Most graphic design classes have a field trip to see how typographers worked for newspapers 60 years ago, and learn why their job was important. Such a day will come for designers too, when it comes to writing code by hand.

But till then, design and code still matters.


Mobile users account for more than 15% of the traffic to my company's corporate website (for reference, IE is 26%, Firefox 35%). Making a webpage 10 times bigger and 40 times slower to render isn't really an option.


Downvotes are deserved. You may not want to see HTML, but the thing is, it impacts the performance of the page. Client-side performance has a direct and measurable business impact.

Also, let's not forget that more and more people browse the web on their mobile phones. Some still pay a lot for bandwidth or have a traffic cap, so every byte counts there. Mobile phones are also still pretty restricted in resources, and if you go over some limit, components of your page won't be cached and the browser will have to redownload them; then see the first point. If you use any kind of DOM manipulation, having a gazillion unneeded elements will slow the script down a lot. Once again, that matters even more on phones and tablets than on desktops.

Others have already mentioned accessibility. There should never be excuses for sloppy code, not to mention horrible code like this.


First of all, it takes almost no time at all to learn HTML/CSS. I'm willing to bet that it takes just about as long as it takes to learn how to use all the controls in Muse. Supporting IE6 is the most time-consuming aspect of cross-platform web support, and there is plenty of precedent for that available freely online. ASM is not as easily learned and understood as HTML/CSS.

Second, HTML is for telling the browser how to lay out content. When you use Muse or Dreamweaver or iWeb or whatever, you're essentially "scripting" your HTML in a proprietary GUI. When that GUI changes or disappears, how will you maintain that page? Yes, by hand.

All WYSIWYG GUIs should seek to output human-maintainable code, at the very least. It isn't a performance issue.


And how much is it compressed? Let's see...

  ~ $ curl -s 'http://studentweb.infotech.monash.edu/~wlay/FIT1012/muse-demo/' | gzip | wc -c
    1741
  ~ $ curl -s 'http://muse.adobe.com/index.html' | gzip | wc -c
   11521
Hmm, so 11.5kB vs 1.7kB with gzip, which I believe browsers can usually handle. That's a factor of 6.6, incidentally. I don't know much about this, but might it still be ok?

I thought to check this because of ridiculousfish's old article (note the FAQ at the bottom, "Isn't that a humungous flood of markup?"): http://ridiculousfish.com/blog/archives/2009/06/01/roundy/#f...


Keep in mind that those connections are all going to be stuck in TCP slow start (gradually improving) for the duration of their communications, meaning that the smaller version gets loaded in 2*RTT, as it will probably fit, in its entirety, in 2 packets. The larger one is going to take 8 or so packets, meaning a lot more RTTs (probably at least 4, assuming aggressive TCP tuning, and possibly 6). Since RTT can easily be 120ms, these can be substantially larger load times, and that can make a huge impact on user impression:

   Even small changes in response times can have significant effects. Google found that moving from a 10-result page loading in 0.4 seconds to a 30-result page loading in 0.9 seconds decreased traffic and ad revenues by 20% (Linden 2006).
edit: It has been pointed out below that rfc2581 is going to mitigate this somewhat, and they are absolutely right, although I don't know what the implementation levels on this are in the real world, so my observations above may be obsolete for newer OSs.
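The slow-start effect above can be sketched numerically. This is a deliberate simplification (congestion window doubling every RTT, ignoring the handshake and delayed ACKs), and the segment counts assume roughly 1460-byte packets:

```javascript
// Rough number of data-transfer round trips needed to deliver `packets`
// TCP segments under classic slow start: the congestion window starts at
// `initCwnd` segments and doubles each RTT until everything is sent.
function slowStartRtts(packets, initCwnd) {
  let rtts = 0;
  let cwnd = initCwnd;
  let sent = 0;
  while (sent < packets) {
    sent += cwnd;
    cwnd *= 2;
    rtts += 1;
  }
  return rtts;
}

// ~2 packets (the 4.75kB remake) vs ~8 packets (the 78kB original),
// with a conservative initial cwnd of 2 segments:
console.log(slowStartRtts(2, 2)); // → 1 RTT of data transfer
console.log(slowStartRtts(8, 2)); // → 3 RTTs (2 + 4 + 8 segments ≥ 8)
```

With a modern initial cwnd of 10, both pages fit in one round trip, which is the point made in the reply below about RFC 2581 and newer kernels.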


Basic RFC 2581 should hardly count as an aggressively tuned stack, and even it'd do the larger file in 4 roundtrips (syn/synack, req/2 packets, -/4 packets more, -/8 packets more). These days even an initial cwnd of 10 might not count as aggressive any more, given it's the default initial cwnd on recent Linux kernels...


That factor is exacerbated on a smartphone. Plus, some of them pay for bandwidth.


As long as tools like Dreamweaver exist, there will be a need for good front-end developers to clean up the mess.


What a shameful waste of a good front-end developer's time.


"What a [lucrative] waste of a good front-end developer's time."


You couldn't pay me enough to _maintain_ that mess. Not even a unit test in sight if something breaks.


Hopefully this issue will get back on the table with increasing mobile internet usage and people bouncing off because of poor load times. I can't count the number of times I've had to hit the back button just because a page takes too long to load.

I really do not want to see that 1MB picture of your dog mascot, but I was actually interested in your 5kB product description... Oh well, if you don't care about your site, you probably don't care about selling your product either.


P.S. The big background image is only about 49kb...


To add insult to injury, the layout is broken in chromium on Ubuntu. Whereas the remake you posted is running as expected.

http://i.imgur.com/otaOc.png


Speaking of which, "Adobe Air is not available for your system." :-P


True, I have not come to expect anything else, so I run a WIN7 virtual box, no biggie.


Dreamweaver required you to know quite a bit about HTML. Muse is like InDesign/Photoshop, which people use to do mockups of designs (not UIs/wireframes) today. It cuts translation to code from the workflow. That's a meaningful shortening of the feedback loop for a designer.

As for the rest of us, we don't have to be concerned till Muse starts learning the quirks and features of CSS faster than ourselves.


The original does have at least one advantage: preloaded hover images. It's nice for them to improve usability a tiny bit, along with all the bloat. I can't imagine a complete newcomer to HTML and CSS hacking together a page and having it come out nearly as bad. Any designer who can learn design can easily learn to code better than Muse, in my opinion.


Sprite sheets?


I can't tell whether you're suggesting a solution or taking a guess as to what Muse did. I think it's the latter. Sorry if I interpreted your question incorrectly. :)

They're using a hidden div full of <img> elements to load the hover images before they're requested by an actual hover. It's all the way at the bottom of their code: <div class="preload_images"> [Removed for brevity] </div>

Sprite sheets are another option (using the sliding doors technique), but they're a bit more ungainly. They would save a couple HTTP requests, but that extent of optimization isn't necessary on most sites. Unless I've already combined all my stylesheets into one file, I certainly wouldn't start combining images.

What really matters is perceptible lag to the user, and either technique works just as well for that.

I find Adobe's technique kind of neat, and I'll probably use it in some of my future websites.
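A minimal sketch of that preloading idea done in script rather than a hidden div. The image names here are hypothetical, and the factory argument exists only so the logic can run outside a browser; in a real page you'd pass `() => new Image()`:

```javascript
// Preload hover images by creating Image objects and setting their src,
// which triggers the download. The elements are never appended to the
// document, so there's no hidden-div markup cluttering the page.
function preloadImages(urls, makeImage) {
  return urls.map(function (url) {
    const img = makeImage();
    img.src = url; // assignment starts the fetch in a browser
    return img;
  });
}

// Browser usage (hypothetical file names):
// preloadImages(['nav-hover.png', 'logo-hover.png'], () => new Image());
```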


The windowing technique isn't ungainly when it's done automatically for you a la SpriteMapper :)

http://yostudios.github.com/Spritemapper/


Or you could inject them into the DOM so they load, but don't exist anywhere on the page.


Thanks for that. I know a little about ancient-style table-layout web dev, but didn't know what the "proper way" of laying out that site would have been.


Seriously, you have to learn how to use CSS sprites...


The "connection between things that are not really connected" part match my own experience of having schizophrenia very well.

It's not uncommon for me to have thoughts that make very much sense and are meaningful to me while others simply can't make any sense of them whatsoever.

This means I'll have to be cautious of my own thoughts and I often doubt them.

Thoughts as well as meanings change over time too. Sometimes I write down ideas I think are brilliant to save for later. Revisiting those writings can be troubling, because I might not be able to make similar sense of what I wrote earlier.


Having researched potential PoS systems for a local shop I can say there is a lot of room for improving the PoS products but I'm far from convinced it's actually lucrative to provide such solutions.

As rick888 points out, you will definitely need a solution that is resilient to system failures. The system needs to be functional in all sorts of conditions.


I was looking at a subscription model, kind of like what the 37signals guys do, but with more affordable pricing. I was thinking something like $20-$30 a month for the basic package.

Thank you for your pointer about functionality and system threats, I hadn't really thought about that and will be looking into it very thoroughly.


In my case, subscription models with somewhat fixed pricing were attractive because they provide a way to even out the cost over time.

But price was not the only consideration. My client's concern with a subscription model was whether the provider would still exist and provide the same service in a year, or in five years for that matter. The time they had to invest in the PoS (which included inventory management) needed to be secured. They were okay with phasing in the inventory over a long period (it was a specialty store with a huge inventory), given the system didn't rely on knowing the full inventory to be functional, as long as the time investment couldn't be lost. Should the PoS provider go bankrupt or otherwise stop the service, they wanted to be sure their effort wasn't worthless and could be ported somewhere else. Contracts with minimum binding periods were okay as long as the time they spent using the system could still benefit them elsewhere.

Someone pointed out that the robustness wasn't a deal-breaker because you could revert to handwriting receipts. It's true that you can do that, but the point of a PoS isn't just that it's slightly more effective than a calculator that can print receipts for customers. A PoS grants valuable data. Even older cash registers have data capabilities, such as who the salesperson was and the time of purchase. If you can provide great analytic tools for the data it gathers, that'll be a major improvement over many available PoS systems.

But store owners also care very much about stability. If the system doesn't work during an internet outage, they won't be happy. A power outage is probably the only acceptable reason for a computer-based PoS to be down. If the system is slow and customers are waiting because the internet is unstable, they will also be dissatisfied. Debit card processing all goes over the internet, and when it's slow customers have a bad experience, but at least they know it's because the internet is messing up and some ISP is probably to blame. If the cash register is screwing up, they will blame the store.

I've worked in a store for a couple of years as a salesperson, so if you have more questions I'd be happy to help. I'm available through email: pierre@snowboardforbundet.com.


Hi Pierre, thanks a lot for your comment, you raise some very valid concerns about the application that I will have to take into consideration, and will definitely hit you up via email in the next couple of days.

I guess I should start considering some options so the shops can have an offline app, or some sort of offline synchronization options at least.
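One way to sketch that offline-synchronization idea is a local queue that buffers sales and flushes when the connection comes back. The `send` parameter is a stand-in for whatever real transport (an HTTP POST, etc.) the app would use, injectable so the logic can run and be tested offline:

```javascript
// Offline-first sale queue: record() buffers sales locally, flush()
// tries to sync them and keeps whatever fails for the next attempt.
// `send(sale)` should return true on success, false when offline.
function makeSaleQueue(send) {
  const pending = [];
  return {
    record(sale) { pending.push(sale); },
    flush() {
      while (pending.length > 0) {
        if (!send(pending[0])) break; // offline: stop, retry later
        pending.shift();              // synced: drop from the buffer
      }
      return pending.length; // number of sales still waiting
    },
  };
}
```

In a real PoS client, `flush()` would be wired to a connectivity event or a timer, and `pending` would be persisted to local storage so a crash doesn't lose sales.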


Why the downvotes? Too pricey?


Not sure why someone down-voted that... But it doesn't sound too pricey as long as it does what it needs to do and it does it well. 20-30 dollars is probably what most store owners pay just for being able to accept credit cards. In my country the credit card terminal alone costs the same as a mid-end desktop computer and you still pay for the service to be able to accept credit cards and in some cases fees for the processing.

