Although I don't speak for Google, I did interview there myself, and I speak from that experience.
The election debates are like an interview. Google's hiring process is respected around the world, and I'm confident that interviews structured like the political debates would never fly.
In fact, they would produce absolutely terrible hires; likely even worse than choosing at random.
Lightning-style opinions: 👎
A candidate can get away with almost any rhetorical lie. Zingers and double-binds are disproportionately effective compared to intelligent discourse. Diversion... ok, I'll call it what it is, bullshitting, becomes the name of the game, since any time a question can be remotely tied to a candidate's strongest weapon, it's worth it to make that connection.
Often, these weapons are completely devoid of substance or value to the country, because, let's face it, there is a reason reality TV is so popular, and it's not because it gets stuff done or moves us forward.
But all of these problems with the debate format really reduce to one primary weakness: shotgun-style questions with mere seconds to develop a case and demonstrate the capability to solve, or at least make progress on, an issue.
Let's imagine how this would work in a job interview.
Interviewer: How would you sort 2 gigabytes of numbers if you only have 16 megabytes of memory? You have two minutes.
Candidate: The thing about gigabytes is that they are a measure of size when what you really want to discuss is performance. My opponent has shown a tremendous misunderstanding of computing performance. In fact...
Even if the topic is a little esoteric, I hope you get the point.
Let's imagine the candidate actually answers the question. Even if it were possible to describe a solution in that minimal amount of time, all it would really demonstrate is that they had prior experience with that exact problem.
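For the curious, the textbook answer is an external merge sort: read chunks that fit in memory, sort each one, spill it to disk as a sorted run, then stream-merge the runs. Here's a minimal Python sketch, assuming the numbers sit one per line in a text file and that the chunk size has been tuned to the 16 MB budget:

```python
import heapq
import tempfile

def external_sort(in_path, out_path, chunk_lines=1_000_000):
    # Phase 1: sort memory-sized chunks and spill each to a temp file ("run").
    runs = []
    with open(in_path) as f:
        while True:
            chunk = []
            for line in f:
                chunk.append(int(line))
                if len(chunk) >= chunk_lines:
                    break
            if not chunk:
                break
            chunk.sort()
            run = tempfile.TemporaryFile(mode="w+")
            run.writelines(f"{n}\n" for n in chunk)
            run.seek(0)  # rewind so the merge phase can read it back
            runs.append(run)
    # Phase 2: heapq.merge lazily merges the sorted runs,
    # holding only one value per run in memory at a time.
    with open(out_path, "w") as out:
        merged = heapq.merge(*((int(line) for line in run) for run in runs))
        out.writelines(f"{n}\n" for n in merged)
```

Even this toy version glosses over buffer sizes, I/O costs, and merge fan-out: exactly the tradeoffs a real interview would spend half an hour exploring.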
No interviewer would waste two minutes asking for a solution to a complex issue like that. Instead, if each question could only be given two minutes for an answer, you'd ask general questions.
In two minutes, almost anybody can bring up a host of related terms and sound moderately educated. I know, I've done it, and consequently the interviewer didn't have a much better idea of whether I could actually put my knowledge into practice.
In fact, with flash questions and answers it's almost better to ad-lib on related problems, because it makes it sound like you are going to solve something without having to do the hard work of proving you have a better idea.
Cherry-picking, FUD, and colorful insults win this game. It's too easy to bloviate on a topic in general when there is no time to back up your ideas or claims anyway. And it's even easier to blow hot air if given a background question or one not oriented around a specific problem.
Problem solving: 👍
The real point of an engineering interview is to demonstrate a candidate's ability to solve new problems. This is why engineering interviews at the top companies, and the best hirers, have largely moved from shotgun-style questions to the at-length development of a solution to one or a few problems, with few, if any, open-ended questions about background or subjective topics.
The real point of a debate, or an entire election cycle, should be to demonstrate a candidate's ability to address the issues they will be confronted with and work toward a positive outcome.
It's a waste of all our time if the format encourages candidates to:
parrot the party line
dredge up mistakes the other candidate has made
talk about how positive their traits are or how negative their opponent's
dodge a difficult question and replay their favorite brag or insult
riff on their list of pet policy positions that we have all known for months
And on top of that, it actually encourages simplistic, myopic views of the issues, because absolutely no one is going to attempt to show they are aware of, or actually think about, conflicting values, tradeoffs, or compromises in two minutes.
If any of the issues could be solved without reasoned, thoughtful consideration of multiple angles and pros and cons, we wouldn't need politics, would we?✱
A debate, or interview, with such a shallow question-answer format is going to steer us toward the candidate who can bluster most effectively, prattle off their prepared zingers, and hit us with complicated "facts" that sound big, scary, or important, yet lack context or scale.
On the other hand, a developed interview that allows — rather, requires — candidates to use their knowledge and experience to dive deeply into a complicated issue would allow the interviewer (us) to:
see how a candidate's experience helps them form and explore an idea, not just hit us with background information or elevator pitches.
assess a candidate's ability to relate complex ideas and to explain, simplify, and analyze them.
give a candidate time to show creativity in addressing competing values and benefits.
see how a candidate's temperament affects their ability to make progress.
"I have better temperament." "No, I do!"
And above all, I would hope it would start to remind us that politics is not about which way a candidate has predetermined to vote, or how they will fill their cabinet and the benches, but in fact the opposite:
How prepared are they to make as-yet-unknown judgments, or to surround themselves with other leaders who can — and therefore, what hope do they have of making progress against the incomprehensibly complex and competing demands of a country of 300 million people?
I don't presume that I have any possible hope of getting the debate format to work better.
Rather, I hope I've laid out good reasons why we should completely ignore whatever we think we may learn from debates about a candidate's capability with issues and solutions.
All we get in the debates are either simplistic talking points or dramatainment. We can find the former on just about any website in a matter of minutes (and they don't change for the two years prior to the election, lest the candidate be labeled a flip-flopper). And if we crave the latter, we can simply tune into any reality show. Too bad The Apprentice is currently off the air.
One positive thing I think we may be able to extract between the lines is this:
How well do they work with others?
On that point, there is only one major party candidate who I believe has any hope of working with others, of showing a shred of humanity. I haven't brought up any political issues here, but I'll bet you can guess.
If you are one of about, I don't know, pretty much everybody, who is currently saying we need a nuke to the system because it's all broken, this is required reading. The problem is not politics, it's antipolitics — forgetting that we must compromise. It was written 9 months ago but is more insightful, applicable, and prescient than ever.
Brooks:
We live in a big, diverse society. ... The answer is ... acknowledging other people exist. It’s taking pleasure in that difference and hammering out workable arrangements. As Harold Laski put it, “We shall make the basis of our state consent to disagreement. Therein shall we ensure its deepest harmony.”
This is a list of albums I've listened to on repeat for many cycles in my life, in rough chronological order. I've excluded compilation albums, because, like, that's not fair.
(bolded are a few all-time favorites, with obvious recency bias)
Cherry-picking online arguments and tearing them down is a favorite pastime of people with a soapbox and either no opponents or a weak platform in need of some easy wins.
But hear me out: these are comments from a friend and their friend, from a religious circle close to mine, expressing a view of non-believers that I think many may relate to. So I'm going to do exactly that.
Context:
"For many faith [or, life] is not about finding peace but rather trusting that the struggle is meaningful."
Francine:
And for atheists, life is essentially meaningless.
Warren:
Life isn't quite meaningless for atheists. They may find meaning in family, hobbies, work, personal achievements, and more, but, those things provide temporary meaning, and so one is often drifting from place to place, thing to thing. Faith provides tangible, lifelong meaning that carries you through all walks of life.
Francine:
One may make their own meaning in life but overall, there is no purpose. No plan behind the scenes.
This appears to be meant to illustrate a fundamental lack of value in atheism or, as I'd rather put it, in the absence of a positive belief in god.
I am surprised by, and disagree with, both assessments, though the first more so.
I don't believe there is a fundamental lack of value in the absence of a god; in fact, I believe the opposite, and here's why.
Some background about me
I don't believe in a specific god, only the possibility of a god. And I believe with certainty that there is much more than we now know -- in fact, infinitely more unknown, and therefore likely something of what we'd call a "spiritual" nature.
This is written as if spoken to the original commenters.
Observer-actor disconnect
First, assuming that an atheist experiences life the way you imagine them doing so is a fallacy -- called the Psychologist's fallacy (wikipedia). You're assuming your own objectivity in analyzing not just a fact but someone else's subjectivity.
You don't know whether I experience meaning in my own life, or of what nature that meaning is. You can only know that what you imagine doesn't feel like something you'd find meaning in.
How do I find meaning?
Second, I'm here to tell you life feels far more meaningful to me now. I don't know how you experience belief in god, but I experienced it as a life void of my own meaning (purpose). All my usefulness in life was defined externally. There was no need to discover self-reward, or intrinsic meaning, because I was told my actions would bring about a literal external (extrinsic) and eternal reward.
As it was, I was also always putting my actions to the judgment of an external set of rules.
When I decided that I didn't care whether I got a reward or not, as long as I knew that as time passed I had been a good person --
-- by that I mean, no matter what actions I take, I will always try to do better --
-- then I realized this meant I believed I was a truly good person, with a good heart. And with innate good inside, I could see that external rules were actually a crutch keeping me from developing a better conscience. If I kept leaning on those rules, my internal compass would never need to strengthen. The only thing that would increase would be my confidence that my behavior matched the rules.
Of course, this belief doesn't jibe with your core beliefs. That's fine. However, you're stating here that believing otherwise makes life meaningless. But my experience, as the person you are talking about, shows differently.
This change of mind allowed me to value myself and my lived experience as the most important thing at the center of my circles, not an external set of rules and a promised future reward that has never been observed.
Once I saw that value, it followed that every life has equal value at the center of its own circles, circles which overlap with and affect mine. So every lived experience is the meaning itself.
We have no knowledge of what will follow -- only beliefs. So a belief in no god (or the likely absence of a god) actually puts the entire meaning of one's experience into life itself, based on: 1. ourselves, and 2. every person we interact with, from each and every person's own perspective.
This is the principle "love your neighbor as yourself." We value what we love, therefore life has meaning.
What kind of meaning is it?
Is it temporary, or less inherently valuable?
An atheist's only possibility of meaning in life comes from the awareness of their own existence. But the same goes for a believer in God. So the potential for meaning is unlocked for both.
So why is the atheist's life, just because they may believe they came from dirt and will return to dirt, said to have less meaning?
A believer in god may suppose that the difference in the "meaning" is (from some statements above):
the ever-presence of their purpose in life
the everlastingness of their life
the reward to come
a greater purpose, or plan behind the scenes -- an external plan which is about all of us.
but contrast this with me:
1. All of the purpose and belief I'm telling you about now also goes with me everywhere. If I were knocked out, lost all my things, and was taken to a foreign country with just my life and my mind, I'd still have all the same meaning.
2. Maybe a longer existence gives more meaning?
Comparing "forever" to a fixed lifetime is a function of time progressing, or in other words, future events.
If an atheist believed in permanence of their influence, an atheist has the same potential for meaning from everlasting purpose.
But even with a belief in a likely dissipation of their life's influence on others over time, an atheist has only the present to apply sure meaning.
Given that the present is what we actually sense, not future events, surely this means the atheist has as much or more faith in the meaning of their present existence?
3. Maybe it's because faith gives knowledge of an immense worth of the reward?
Whether the reward is valuable because it's ongoing or because the experience is so much better than we now have -- you may conclude differently -- I don't believe that some ongoing reward is better than the current experience.
Life itself -- the events, experiences, senses, feelings, thoughts -- has an intrinsic reward.
Imagining that I will experience this, or any other good feeling I've had, as a reward forever is not an actual experience, so it holds no intrinsic value for me.
Therefore, my meaning -- as I experience it -- is greater than what I logically gained when I was trying to live life for the future reward.
4. Maybe the meaning comes from knowing we're all working together -- a master plan, whose value is for all of us?
Again, go back to my first statement. Life is the reward, and therefore all lives have a reward. This value is every bit as great to me as if it were part of a plan to which we are all bound.
And if it was some master plan, I don't value the feeling of believing I'm doing something according to a grand design any more than I do the inherent experience. That's just me, but again I am the one who observes and therefore values my own meaning.
Family, hobbies, work, personal achievements are merely the actions I may participate in or seek in this already-meaningful system of values. The people and my own experience are the inherent value, so as I said it goes with me no matter what I have or where I go. Time progresses, and as I experience things, meaning therefore exists.
"Drifting from place to place, thing to thing" actually makes it sound like it adds meaning to me, because the variety of my experience increases, instead of repeatedly driving the same path, becoming more numb to the small differences with each passing.
I'm sure god-believers' lives have meaning under their belief systems; but that doesn't mean that they can know whether mine holds meaning for me or not. And while this perspective may not provide meaning for all without a god-centered system, this is hopefully an explanation of why it does for me.
Cool front-end developers are always pushing the envelope, jumping out of their seat to use the latest and greatest and shiniest of UI frameworks and libraries. They are often found bridging the gap between native apps and web and so will strive to make the UI look and behave like an app. Which app? you may ask. iPhone? Android? What version? All good questions, alas another topic altogether. However, there is another kind of front-end developer, the boring front-end developer. Here is an ode to the boring front-end developer, BFED if you will.
Before you read anything I have to say, go read "Can-Do vs. Can’t-Do Culture" (5 min read) by Ben Horowitz (founder, Andreessen Horowitz). I owe it complete credit for inspiring my one, and only, New Year's Resolution:
2014: Be Stupider
The trouble with innovation is that truly innovative ideas often look like bad ideas at the time. That’s why they are innovative — until now, nobody ever figured out that they were good ideas.
...
They focused on what the technology could not do at the time rather than what it could do and might be able to do in the future.
...
Don’t hate, create.
- Ben Horowitz
Being smart sounds right. So why should I be more stupid? Because the world is full of smart reasons why stuff can't be done.
Stupid, on the other hand, can only try.
I could sit here literally forever and wait for the perfect pitch, the one that is the easiest to knock out of the park, or I could just start swinging.
This concept reminds me of several quotes from the fantastic "Win Like Stupid" article I read a few months ago (hat tip Jason Waters). Some of my favorites:
TOO STUPID FOR CAN’T
STUPID PEOPLE NOT THINK ABOUT CAN’T WIN AT ALL. THEM JUST DO IDEAS UNTIL ONE CAN.
STUPID PEOPLE TOO DUMB FOR ODDS. THEM JUST ASSUME NEXT THING WILL WORK.
CHANCE IF NOT TRY EXACTLY NOTHING.
EVERY SMART PERSON TERRIFIED EVERYONE THINK THEM IDIOT.
STUPID PERSON ALREADY IS ONE, NOT MIND IF PEOPLE KNOW.
WORLD SMART. IT HARD TO OUTSMART WORLD. BE IDIOT. OUTSTUPID WORLD INSTEAD.
Overthinking is the enemy of doing. That perfect time to leave the gate will never come. Most of us have spent plenty of time learning how to not fall down. Just get on the track and run.
So, Happy New Year. The newer stupider me is going to have a great 2014!
It's an initiative for knowing the people behind a website. It's a TXT file that contains information about the different people who have contributed to building the website.
...
The internet is for humans...
We are always saying that, but the only file we generate is one full of additional information for the searchbots: robots.txt. Then why not make one for ourselves?
(My thanks to the excellent HTML5 Boilerplate project, which includes a humans.txt in their project, for pointing me to this.)
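For illustration, a humans.txt is just plain text; here's a minimal sketch, loosely following the sections suggested at humanstxt.org (the names and details below are placeholders):

```
/* TEAM */
Developer: Jane Doe
Contact: jane [at] example.com
Location: Anywhere, Earth

/* SITE */
Last update: 2012/06/01
Standards: HTML5, CSS3
Components: jQuery, HTML5 Boilerplate
```

Like robots.txt, it lives at the root of the site; no tooling required, just a text file for people instead of crawlers.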
First guy: "Is he texting and watching a video at the same time?"
Second guy: "Hey, what are you doing?"
Third guy: "I'm texting and watching a video at the same time."
First guy: "You can't do that."
Second guy: "You can't do that."
Third guy: "And yet I'm doing it."
First and second guys: "Yeah - nice!"
That's right, this ad touts one of Samsung's newest features: texting and watching a video at the same time.
That should say "screen images simulated. Final version may actually line up correctly".
For an OS that seems to be marketed at robots (cue "Droid" voice), maybe this has been a long time coming. But for humans like me, for whom input and output rarely mix, and who are already working on a screen the size of their palm, I have my doubts about the real-life usefulness of this feature.
My phone already plays audio at the same time as I can text or use other apps. In contrast to Samsung's latest, one feature I would love is an easier way to pause the audio. When I'm listening to a podcast, I frequently take notes — but I can't do it at the same time, or I end up rewinding anyway.
But it's a feature, and features sell, right?
Is playing video on the same screen as a text composer difficult? Technically speaking, no. It might be a design challenge to make the experience seamless and handle the different modes. But even then, it's achievable.
It's a great read about avoiding featuritis. Ballard's premise:
Users are not really compelled by features; they care only about achieving their goals. The real cost of adding excess features to digital products is unhappy customers.
I understand that Samsung has to innovate (especially given recent events). But let's remember that innovation doesn't always have to be additive.
The Sad User Slope
Hitching your horse to the "features-sell" wagon is actually a very dangerous game.
This is aptly illustrated in Ballard's article with the Happy User Peak, originally published by respected author and creator of the Head First book series, Kathy Sierra.
Unfortunately, so many of these features actually fall on the Sad User Slope. Where do you think texting while watching a video falls?
What puts a product onto the happy user peak, or the sad user slope? Ballard's definition is simple:
What turns a pleasurable solution into a cumbersome tool is the number of visible features that are irrelevant to the user’s goal.
Why this happens
These concepts aren't revolutionary. Sierra's post about featuritis is from 2005. Don't make me dig up a post about what phones were like then. (Hint: one term for them is feature phones.)
So, let's assume that Samsung isn't filled with idiots. Why is it so hard to resist adding features where the human value is so low — and then create commercials about them?
Ballard:
Knowing exactly what features will make an elegant solution depends on establishing a deep understanding of the users’ goals. When we fail to know what the users’ behaviors, motivations and goals are, we create complex and burdensome tools. This is because instead of first designing digital products with a real understanding of the user, we cast a wide net of features with the hope that we will capture a wide audience. We wrongly feel that the safe way to develop products is to provide as many features as possible. This is self-defeating to our goal of providing real solutions. The secret to success is to get to know your user and be specific with your solution. In short, focus on your customers’ goals and not on features. (emphasis added)
And why wouldn't we think that our users will want these features? After all, they tell us they do:
Users will almost always say yes to more features. “Would you like?” questions will generally be received with a positive response even if the feature will not help users be successful. The idea that “more is more” is so compelling, and users are unable to visualize the experience that will result from applying that approach.
Just say no.
Apple is famous for their focus. Recently, John Gruber linked to a fantastic article titled "The Disciplined Pursuit of Less", written by Greg McKeown for Harvard Business Review. In it, McKeown puts forward what he calls the Clarity Paradox:
Why don’t successful people and organizations automatically become very successful? One important explanation is due to what I call “the clarity paradox,” which can be summed up in four predictable phases:
Phase 1: When we really have clarity of purpose, it leads to success.
Phase 2: When we have success, it leads to more options and opportunities.
Phase 3: When we have increased options and opportunities, it leads to diffused efforts.
Phase 4: Diffused efforts undermine the very clarity that led to our success in the first place.
Curiously, and overstating the point in order to make it, success is a catalyst for failure.
In his own post, Gruber suggested that Apple's uncommon focus has allowed it to keep clarity and overcome this pattern.
With the resulting clarity, Apple keeps its products focused on the goal of providing exactly what users need.
Steve Jobs once said:
I'm as proud of the products we have not done as I am of the products we have done.
Replace products with features and you get a similar point. That sentiment is echoed by Jobs himself, as recounted by Ballard:
In a meeting between Steve Jobs and some record label representatives concerning Apple’s iTunes, Jobs kept fielding questions like "Does it do [x]?" or "Do you plan to add [y]?"
Finally Jobs said, "Wait, wait — put your hands down. Listen: I know you have a thousand ideas for all the cool features iTunes could have. So do we. But we don’t want a thousand features. That would be ugly. Innovation is not about saying yes to everything. It’s about saying NO to all but the most crucial features."
We don't yet know if this feature makes an ugly phone; Samsung's screens are simulated, but even that version doesn't give a lot of hope. Neither does the fact that users are having trouble finding how to use it.
Features don't need to be fancy. They don't need to be gimmicky or do new tricks. They just need to get done what the user wants or needs.
Just say no, and give the features that are done right the spotlight they deserve.
Stripe is a web payments company whose engineering team gets web security. They launched a hacking contest, and Joseph Tartaro of IOActive has kindly compiled this writeup of the solutions.
It is a must-read for anyone interested in web security. Wait, scratch that — for anyone who even touches web application code.
In February, the engineering team at Stripe (easy, secure web payments) created the first Stripe Capture the Flag, a "security wargame" intended to test your ability to find exploits in vulnerable code. That event was largely based on understanding Unix systems and C exploits, with one PHP exploit thrown in.
The original event was a huge success, with attention from Hacker News as well as Reddit (of course).
A few days ago the team released Stripe CTF 2.0 which they are calling "Web Edition". They stepped up the support systems for this one, with logins, a leaderboard, and public code on GitHub. But what's even better is the type of exploits that are covered:
The Retina MacBook is the least repairable laptop we’ve ever taken apart: unlike the previous model, the display is fused to the glass—meaning replacing the LCD requires buying an expensive display assembly. The RAM is now soldered to the logic board—making future memory upgrades impossible. And the battery is glued to the case—requiring customers to mail their laptop to Apple every so often for a $200 replacement. ... The design pattern has serious consequences not only for consumers and the environment, but also for the tech industry as a whole.
And he blames us:
We have consistently voted for hardware that’s thinner rather than upgradeable. But we have to draw a line in the sand somewhere. Our purchasing decisions are telling Apple that we’re happy to buy computers and watch them die on schedule. When we choose a short-lived laptop over a more robust model that’s a quarter of an inch thicker, what does that say about our values?
Actually, what does that say about our values?
First of all, "short-lived" is arguable, and I'd argue for "flat-out wrong". I don't take enough laptops through to the end of their life to be a representative sample, but I've purchased two PC laptops and two MacBook Pros. After two years of use, both PCs were essentially falling apart (hinges, power cords, and basically dead batteries) while the MacBooks were running strong.
My 2008 MacBook Pro did get a not-totally-necessary battery replacement after a year, but my 2010 has run strong for two years. I'd expect nothing less from the Airs or new MacBook Pro. So short-lived might be a relative characterization if anything, and only if you consider the need to pay Apple to replace your battery instead of doing it yourself a "death".
Second, and more important, this thought occurred to me: when we look at futuristic computing devices in movies such as Minority Report or Avatar, do we think, "Neat, but those definitely don't look upgradeable. No thanks"?
Do we imagine that such progress is achieved through the kind of Luddite thinking that leads people to value "hackability" over never-before-achieved levels of precision and portability?
The quote above is the summation of Wiens' argument that "consumer voting" has pushed Apple down this road, and that we need to draw a line in the name of repairability, upgradability and hackability.
I'd argue that Apple's push toward devices that are more about the human interface and less about the components is a form of a categorical imperative, a rule for acting that has no conditions or qualifications — that there is no line, there is only an endless drive towards progress: more portable devices that get the job done with less thinking about the hardware.
That is what drives descriptions like Apple uses in its product announcements: magical, revolutionary — not hacking and upgrading.
Approved this morning, Mac App Store tool Folderwatch — an app that monitors, syncs and mirrors important files automatically — included a small detail in its update notes, stating “Retina graphics” were a new addition to the app.
Rumor is that more than just the Airs will be getting Retina displays:
We reported earlier in the week that Apple may have plans to announce updates to many of its Mac computers at WWDC. The models to be updated include the 15″ MacBook Pro, the 11″ and 13″ MacBook Air models and the iMac.
All of the various portable models being updated are rumored to be getting a Retina display boost.
I've been thinking for a while that the next round of MacBook Airs is perfectly positioned for Retina displays, so I think that's pretty much a given at this point. A colleague of mine thinks that touchscreen displays might also make an appearance -- which would justify the scrolling direction switch in Mac OS X Lion last year.
Yes, nineteen. Zero is another significant number: the number of times I've downloaded the new version of Chrome.
Jeff Atwood wrote about this phenomenon, and the eventual paradise of the infinite version a year ago, stating that the infinite version is a "not if — but when" eventuality.
Chrome is at one (awesome) end of the software upgrade continuum. I've rarely noticed an update to Chrome, and I've rarely checked the version. A couple of months ago I did, when a Facebook game I was working on suddenly started crashing consistently. The problem was tracked to a specific version in Chrome, and within hours, I heard that an update was released. I checked my version again, and Chrome had already been updated with the fix.
This nearly version-free agility has allowed Chrome to innovate at a pace not matched by IE, Firefox, or even Safari. I swear the preferences pane has changed every time I open it (not necessarily a good thing).
At the other end you have the enterprise dinosaurs; the inventory, procurement, accounting systems that are just impossible to get rid of — I'm thinking of one local furniture retailer which still uses point-of-sale and inventory systems with black/green DOS-looking screens on the sales floor.
Most other software industries fall somewhere in between, trying to innovate, update, or even just survive while still paying the backward-compatibility price for technical decisions made in years past.
Check out this gem alive and well (erm, alive, at least) in Windows Vista:
Look familiar? That file/directory widget has seen better days; it made its debut in 1992 with Windows 3.1:
Supporting legacy versions is not just a technical problem, though. For some companies and products it's just in their nature to be backward compatible. And sometimes to great success; take Windows in the PC growth generation, or standards like USB, for example. For some opinionated few, backward incompatibility is downright opposed to success.
And then there's Apple.
Apple may not be a shining example of smooth upgrades, but they aren't shy about doing it.
Anyone who knows Apple knows they are just about the least sentimental bastards in the world. They'll throw away all their work on a platform that many were still excited about or discontinue an enormously popular product that was only 18 months old. They have switched underlying operating system kernels, chip architectures, their core operating system API, and notoriously and unceremoniously broken third-party applications on a wide scale with every new OS (both Mac and iOS).
OK, so they aren't entirely unsentimental — Steve Jobs did hold a funeral for Mac OS 9, about a year after OS X replaced it on desktops.
All of this allows Apple to put progress and innovation above all else.
Steve Jobs, in his famous "Thoughts on Flash", made clear his view that Apple doesn't let attachment to anything stand in the way of platform progress:
We know from painful experience that letting a third party layer of software come between the platform and the developer ultimately results in sub-standard apps and hinders the enhancement and progress of the platform. If developers grow dependent on third party development libraries and tools, they can only take advantage of platform enhancements if and when the third party chooses to adopt the new features. We cannot be at the mercy of a third party deciding if and when they will make our enhancements available to our developers.
This attitude has paid off for iOS. Analysis of app downloads shows that uptake for new major versions is rapid. iOS 4.0 just barely got going before it was traded in for iOS 5.0.
The story is not the same for Android. MG Siegler reveals (via DF) that Android 4.0.x "Ice Cream Sandwich" has seen only 7.1% uptake in 7 months. As Siegler notes, Google is expected to announce the next version this month, which would likely occur before ICS even sees 10% adoption.
Two Googles
So at one end of the upgrade-nirvana spectrum, we have: Google. And at the other end: Google.
How could they be so different?
A lot of people want to draw tight comparisons between the mobile OS wars going on now and the desktop OS wars that Apple lost in the 80s and 90s.
If we do that, for a minute, we might actually see some similarities. Apple currently supports six iOS devices, with a total history of twelve. By some accounts, there may be as many as 3997 distinct Android devices.
Android is also "winning" in market share, with 51 percent of the US smartphone market vs. iOS's almost 31 percent, according to March 2012 Comscore data. Gartner numbers put Android ahead even further, with 56 and 23 percent, respectively (and this has been the story for some time).
Or is it?
Isn't all smartphone market share equal? What's interesting about these numbers is just how different they are from web usage numbers. Shockingly different.
"Smartphone" market share numbers suggest Android is winning, yet certain usage statistics still put Apple in a comfortable lead. Meanwhile, Apple seems quite comfortable blowing away growth estimates again and again while nearly every other device maker is struggling to turn a profit. What does Apple know that the industry doesn't?
Has anyone else noticed a lack of "dumbphones" (non-smartphones) around them lately? Android phones have largely replaced these non-smartphones or feature phones in the "free" section of major cell phone providers' online stores and store shelves alike. This means customers who walk into a store looking for a new phone or a replacement but not knowing (or really caring) what they are shopping for are often ending up with an Android phone — just like they ended up with feature phones in years past.
That's great for market share. But how is it great for business? Generally speaking, these people don't buy apps. They don't promote or evangelize. They aren't repeat customers. Their primary business value is to the service provider; not to the device maker, not to Google.
And they are extremely unlikely to update their operating system.
The good news for Google is that these users who don't upgrade don't represent a large cost to abandon, should they decide to make innovative, backward-compatibility-breaking changes in future versions.
The bad news is that third party developers will be afraid to support new (and possibly risky) versions, or break compatibility with old versions in order to adopt new features. All reasonable version distribution statistics, such as the new Ice Cream Sandwich numbers, show basically no adoption at all. So why take the risk?
I get the feeling that Apple will be happy to continue to innovate and bring new and better features to its customers; and Android will continue to match feature-for-feature on paper, but not where it counts: in the App Store.
The articles on Rands keep getting longer and longer, and as I’m finishing a piece, I worry, “Is it too long?” I worry about this because we live in a lovely world of 140-character quips and status updates, and I fret about whether I’ll be able to hold your attention, which is precisely the wrong thing to worry about. What I should be worried about is, “Have I written something worthy of your attention?”
Houzz.com on "magic mirrors" — computerized touch surfaces on the mirrors and windows in your home:
Magic mirrors and magic windows — in fact, magic glass surfaces all over the house — will soon become commonplace, thanks to breathtaking advancements in computers, computer interfaces and, of all things, glass.
Count me as a skeptic on the word "soon". This technology barely exists, let alone having a good reason to exist (yet).
Devices should be getting more mobile, not less. To be successful, innovations should also solve a problem. I don't remember having the urge to check my email while leaning over the bathroom sink. It's also fairly counterproductive to OCD-types like me: "Now introducing smudgy screens all over your house, not just in your pocket."
The whole thing reminds me of the Google Glasses: why do we need them? Speaking of which, they've popped up in the news again; it turns out they won't be nearly as cool as originally pitched.
... there are a lot of folks who think gamification means pulling the worst aspects out of games and shoving them into an application. It’s not. Don’t think of gamification as anything other than clever strategies to motivate someone to learn so they can have fun being productive.
The ever-insightful Lukas Mathis took the thoughtful middle ground. After I read his piece, I started a draft of my own. Now that draft will probably never see the light because, although the whole article is worth reading, Lopp has in three sentences summed up my feelings.
Of course, if your purpose in using gamification is anything other than helping the user enjoy learning and to be productive, then you'd do well to hear from the critics how you might be making your users feel about your software.
Creating or expanding business relationships is not about selling – it’s about establishing trust, rapport, and value creation without selling. ...
Engage me, communicate with me, add value to my business, solve my problems, create opportunity for me, educate me, inform me, but don’t try and sell me – it won’t work. An attempt to sell me insults my intelligence and wastes my time. Think about it; do you like to be sold? News flash – nobody does. Now ask yourself this question, do you like to be helped? Most reasonable people do. The difference between the two positions, while subtle, is very meaningful.
Great article, and a lot to think about. A corporate goal like "increase sales by 50%" can be taken two ways.
One way would be to imagine that more people need to be convinced to buy your product.
The other way is to consider how you can add value and find the customers who most need what you are offering.
Create goals that communicate your actual intended action and aren't open to interpretation. Measurements should be customer happiness levels, not dollars of income (that may or may not have been pried from your customers' unwilling fingers).
Focus on the reason that your product exists and help it help people.
Looks like it's downfall-prediction time in the popularity lifecycle for Facebook.
Geoffrey James of Inc. Magazine published an editorial which argues that Facebook, unlike LinkedIn, is vulnerable to being ditched for "something 'cooler'":
LinkedIn is all about business and people's resumes. Because its scope is limited to fundamentally dull information, LinkedIn is simply not vulnerable to something "cooler."
Sure, somebody could launch a site similar to LinkedIn. (And I'm sure plenty of people have.) But why would the customer base bother to change? Nobody on LinkedIn cares about being cool. LinkedIn's beauty is that it's dull but functional–like email and the telephone.
Fair point. Being popular because you're cool does make you vulnerable to the next cool thing. But he doesn't make good arguments — or any at all, really — for his case that people have been using Facebook because it's cool.
Specifically, this argument falls flat for me:
Consumer-oriented social networking sites are like television networks: People will switch when there's something better on another channel.
Actually, consumer-oriented social networking sites are nothing like television networks. Exclusive content provides customers a reason to use more than one network. And, most importantly, there's no cost to switch to something better on another TV channel — in fact, it's common to switch back and forth.
Facebook, on the other hand, with years of posts, photos, and other social interactions (yes, many of them useless), as well as a large current audience, has a huge cost for a user that wants to "switch".
James does address that:
Frankly, I think it's just one online conversion program away from losing its customer base and becoming the next MySpace.
An "online conversion program" might provide a way to minimize the data loss, but Facebook has a much larger asset: market momentum.
ComScore analysis shows that 69% of North American internet users have Facebook accounts, according to a CNET article. People use Facebook because other people use Facebook — their friends, specifically. That's been one of the primary drivers of its virality, and it is now the reason for its ubiquity. In that sense, Facebook is more like email or the telephone than LinkedIn.
No online conversion program is going to move their friends over to the "next" Facebook.
A dead-simple, free, JSON API for retrieving the city and state from a zip code. And, a sample/beta jQuery plugin to show how easy it is to auto-populate those fields when the zip code is typed. Could this be an end to the years of extra-typing madness?
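For a sense of how little code the lookup side would take, here's a sketch (the endpoint URL and response field names are placeholders, not the actual service's):

```python
import json
from urllib.request import urlopen

def city_state(zip_code):
    # Hypothetical endpoint and field names; substitute the real API's.
    with urlopen(f"https://zipapi.example.com/v1/US/{zip_code}") as resp:
        data = json.load(resp)
    return data["city"], data["state"]

# e.g. city_state("90210") might return ("Beverly Hills", "CA")
```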
Great article in Forbes by Warren Buffett for their "When I Was 25" series.
Warren Buffett, age 25:
Although I had no idea, age 25 was a turning point. I was changing my life, setting up something that would turn into a fairly good-size partnership called Berkshire Hathaway. I wasn’t scared. I was doing something I liked, and I’m still doing it.