There are 3 dimensions to this answer:
1 - The company has to prove that your skill is unique and that they can't find anyone else. Mostly this is covered by a good description of your qualifications and the task at hand.
2 - There have to be enough H1Bs left: every year, the US issues a certain number of H1Bs. If they run out, you have to wait for the next year.
3 - The company has to be well funded, i.e. able to prove that it has enough funds to pay you for the duration of your H1B (typically 3 years). If the company's funds don't suffice for a 3-year employment (early-stage start-ups), you could also apply for a shorter-term H1B (say, 1 year) and then extend it once the company's funds are sufficient.
In any case, I'd highly recommend talking with a good immigration lawyer. There are many exceptions and alternatives, depending on your country, skill set, etc.
Not quite convinced by the argument that 'Apple killed' it. IMHO it's always dangerous to commit to a single plug design, given how those plugs have evolved over time (think USB, micro USB, mini USB for Android, etc.). I quickly amassed a plethora of unusable chargers in my drawer; only the Apple port lasted across multiple generations of devices. I'd say the Kickstarter team could take this as an opportunity to redesign an initial flaw of an otherwise nice charging station.
They can't. The point of it was to be a charging station that could charge everything. Apple's updated, more anti-competitive license terms make this impossible by forbidding anything with a Lightning connector from also having any other connector. It's basically Apple forcing third parties to make docks, speakers, etc. that ONLY work with new Apple products and won't support anything else.
They could have simply provided a standard USB-A port. Trying to put all possible connectors on the thing sounds like a fool's errand, regardless of Apple's stance.
If you look at their Kickstarter page, they actually did provide USB-A ports on the bottom of the unit. They just weren't willing to 'compromise their product', which makes me believe they perhaps had other underlying issues unrelated to Apple's licensing.
There is already a glut of portable chargers on the market that provide a standard USB-A port. The point of this device was to stand out by providing the Lightning (and other) connectors without needing additional cables.
Wait, why does Apple even have the power to do this...?
It's a connector... Why on earth is it possible to patent a connector that doesn't involve any particular innovation (take some wires, rearrange them a bit, tweak the shell shape) in the first place...?!?
Well, it's not really DRM -- the chip in there is used for lots of stuff. There was an article about the Thunderbolt cable (which has a similar design to Lightning) showing that it has a chip inside the cable as well. Note that Thunderbolt was actually designed by Intel.
I think lots of future cables will have these chips in them. In Thunderbolt, the chip multiplexes and demultiplexes the data. Perhaps the idea is that speed increases can be had just by upgrading your cable instead of upgrading the port.
In the Lightning cable, the chip is also responsible for determining the orientation of the connection, since the connector is reversible.
The pins are on the outside of the connector. It's trivial to short-circuit them by accident, which could easily cause a fire if there were no chip in there to control when to "shove 5V over the wire".
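To make that last point concrete, here is a conceptual sketch in C of a cable controller that keeps the exposed contacts unpowered until it has detected the plug orientation and finished a handshake. All of the names and steps are hypothetical illustrations, not Apple's actual firmware; the point is only that the outer pins stay dead until the chip decides to route power.

```c
#include <stdbool.h>
#include <stdio.h>

/* Conceptual sketch only: a made-up cable controller. None of these names
 * reflect Apple's real design; they just illustrate "pins stay unpowered
 * until the chip says otherwise". */

typedef enum { ORIENT_UNKNOWN, ORIENT_A, ORIENT_B } orientation_t;

static orientation_t detect_orientation(void)
{
    /* Real hardware would sense which side's contacts are mated with the
     * receptacle; here we simply pretend side A was detected. */
    return ORIENT_A;
}

static bool handshake_with_host(void)
{
    /* Placeholder for whatever identification exchange the host requires. */
    return true;
}

static void enable_power_rail(orientation_t o)
{
    printf("routing 5V to the %c-side contacts\n", o == ORIENT_A ? 'A' : 'B');
}

int main(void)
{
    orientation_t o = detect_orientation();
    if (o != ORIENT_UNKNOWN && handshake_with_host())
        enable_power_rail(o);   /* only now do the outer pins carry voltage */
    else
        printf("pins left unpowered: safe against accidental shorts\n");
    return 0;
}
```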
Doesn't DRM stand for digital rights management? I think you may have confused it with something else. DRM doesn't prevent people from manufacturing cables -- it exists to prevent the end user from accessing content they don't own. DRM was never intended as a way to reduce the number of manufacturers out there. In fact, the DRM camp wanted to increase the number of manufacturers so that there would be less hardware that could play media you didn't buy (pirated content).
DRM prevents digital data/signals from going where an end user wants them to. So you cannot play DRMed media on an open-source player, only on players blessed by the DRM vendor. The end user's choices are artificially limited.
And now the end user cannot use any cable he likes, only those cables blessed by the proprietary-connector vendor. You have authentication where none is needed, exclusively to give the vendor power, not to provide the end user with benefits.
Both are about digital data/signals and having artificial restrictions imposed on them. I think the similarities are good enough to warrant the name DRM.
To the layman it may appear that way, but I assure you the lightpeak cables don't have any DRM logic in them; they merely mux/demux signals. I think you're just a bit misinformed, or think that any time there are chips in basic things like cables, they must be DRM devices. I suggest you look up DRM on Wikipedia to brush up on your definitions.
If there are more adopters of lightpeak, there will be other cable manufacturers, and users can buy any cable they want. People who don't understand electronics also wouldn't understand how this might benefit the user.
Personally, I think lightpeak is a fairly interesting way of implementing this; it enables ports to be a lot more capable without having to replace electronics on the motherboard.
Perhaps users who don't fully understand what is going on inside would prefer to think that it's something evil...
I haven't seen the connector myself, but I would not expect it to be devoid of all innovation. But I do find it somewhat disturbing that after 100+ years we're still finding patentable modifications to ordinary low-voltage plugs.
Someone should make a Kickstarter for a compatible charging connector that unambiguously avoids all the patents. Something dirt-simple like just a few bare metal pins sticking out providing voltage and alignment.
You can't. Apple built DRM into their cable with a proprietary security chip, all DRMed and patented up nicely to lock everyone out of just about everything.
From your link, the discussion doesn't seem to have a definitive conclusion:
The folks at Chipworks have done a more professional teardown, revealing that the connector contains, as expected, a couple of power-switching/regulating chips, as well as a previously unknown TI BQ2025 chip, which appears to contain a small amount of EPROM and implements some additional logic, power-switching, and TI's SDQ serial signalling interface. SDQ also uses CRC checking on the message packets, so a CRC generator would be on the chip. Somewhat confusingly, Chipworks refer to CRC as a "security feature", perhaps trying to tie into the authentication angle, but of course any serial protocol has some sort of CRC checking just to discard packets corrupted by noise.
So until someone finds a truly dumb Lightning charging cable, the question of whether DRM prevents one remains open.
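As a side note on why CRC checking is not a security feature: it's just a checksum that any receiver can recompute. Here is a minimal sketch in C using the common Dallas/Maxim 1-Wire CRC-8 polynomial as a stand-in; whether TI's SDQ interface uses this exact polynomial is an assumption, not something the teardown confirms.

```c
#include <stdint.h>
#include <stdio.h>

/* CRC-8 over a message buffer, using the Dallas/Maxim 1-Wire polynomial
 * (reflected form 0x8C) purely as an illustration. It detects corruption,
 * it does not authenticate anything. */
static uint8_t crc8(const uint8_t *data, size_t len)
{
    uint8_t crc = 0x00;
    for (size_t i = 0; i < len; i++) {
        uint8_t byte = data[i];
        for (int bit = 0; bit < 8; bit++) {
            uint8_t mix = (crc ^ byte) & 0x01;
            crc >>= 1;
            if (mix)
                crc ^= 0x8C;
            byte >>= 1;
        }
    }
    return crc;
}

int main(void)
{
    /* Hypothetical packet: three payload bytes, last byte reserved for the CRC. */
    uint8_t packet[4] = { 0x10, 0x27, 0x3A, 0x00 };
    size_t payload_len = sizeof(packet) - 1;

    packet[payload_len] = crc8(packet, payload_len);   /* sender appends CRC */

    /* Receiver recomputes the CRC and discards the packet on mismatch --
     * this catches noise-corrupted bytes, nothing more. */
    if (crc8(packet, payload_len) == packet[payload_len])
        printf("packet ok\n");
    else
        printf("packet corrupted, discard\n");
    return 0;
}
```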
Changemakrs is about inspiration. We're creating a beautiful space for you to discover, collect and share quotes from the people who inspire you most. Our beginning was stevetold.us (which today is Changemakrs.com/stevejobs); it inspired millions of people within hours of launch.
Ssebro, I assume you're asking because the landing page does not reveal much? If that's the case: most of our users come from Facebook, Twitter, Pinterest, etc. They reacted to a Changemakr's post that one of our users created. For this group of visitors we don't need to explain what it is; they already know, and that's why they came in the first place.
However, for anyone else the landing page is not ideal (yet); it's a work in progress; expect it to change soon and often.
Agree with 'hard to get rid of'. But then again, given the vast amount of storage (incl. Apple's cloud), is it really a priority to give apps a shelf life for auto-deletion? Or is it a 'nice to have' feature to keep screens free of clutter?
...if only AT&T would introduce carrier pigeons; I'd finally have data throughput in the Mission with my iPhone! ...But then again, it might end up being a +$20 option, and you'd have to sign up for a 2-year carrier-pigeon plan ;)
Is having a forced Facebook login necessarily a bad thing? Sure, the sign-up rate might only be 50%, but if it's inherently a social app, it'll pay off with a richer user experience once the user is 'signed in with his graph'. Of course there are many apps that simply abuse the social graph for viral invites only - that sucks. But there are also apps that are just more fun if they can overlay / provide access to the social graph. In the end, every app attracts the users it deserves: those who appreciate it and those who don't. As an app developer, if my app is worth the trust of a social graph, I'd rather have the users that appreciate it.
Quite a skewed analysis if it's based on search terms, given that some parameters have shifted in the last few years. Google used to be a jump-off/quick-navigation point to get to websites people were very familiar with: instead of typing in the website URL, it was convenient to just Google it and then go. But recently, most links are clicked within/from a social media context; thus Google is becoming less and less of a jump-off/quick-navigation point. I would say the graph shows the fall of Google more than the fall of social networks.
The graphs show both search volume index and news reference volume.
While I think your point should be considered, I'm not sure those distortions show up in the graph.
Google still serves as the quick navigation point to get to the social networks.