Saturday, May 20, 2017

To Eden on Her Sixth Birthday: A Plea for Education Reform

“Now the problem with standardized tests is that it's based on the mistake that we can simply scale up the education of children like you would scale up making carburetors. And we can't, because human beings are very different from motorcars, and they have feelings about what they do and motivations in doing it.”
Sir Ken Robinson.

“It’s okay, Uncle Ben. I'm sure I'll appreciate him when I’m grown up.”
Eden, handing me back the headphones and letting me down easy after her first experience with Bob Dylan.

“You know… it's not a good thing that hammocks don't have seat belts.”
Eden, having just fallen out of one.

“Only by counting could humans demonstrate their independence of computers.”
Douglas Adams. Hitchhiker’s Guide to the Galaxy.

My niece, Eden, turned six recently. She came to visit us in California for her birthday. We went to Disneyland together and spent a few days at home as well. Typical spring break for a six-year-old living in New Jersey. Eden and I end up spending quite a bit of time together when she visits, mostly sitting in the backyard and talking. She relishes the California sunshine and warmth after a cold New Jersey winter spent cooped up in the house. And I, too, admit to being a sun worshipper, enjoying the backyard and California’s seemingly perpetual springtime.

These conversations with Eden can revolve around anything: recent events in school, what happened on a cartoon she was recently watching, the latest happenings on Wall Street, what Donald Trump said about the Middle East crisis, you name it. I kid you not. I have learned to talk to her like an adult and am constantly amazed by how much she can absorb. Her mom recently sent us a video she had taped of Eden in their living room as she pretended to deliver the evening news while sitting in a cardboard-cutout TV. Her monologue included relevant commentary on the political situation in DC, the weather in LA and Jerusalem for the next week, the stock market, traffic on the George Washington Bridge, and other topics she had picked up on TV. She obviously doesn't understand all the nuances but is smart enough and has learned enough to participate meaningfully in an adult conversation - or, in this case, presentation.

Eden, in my honest opinion, is brilliant. She speaks three languages fluently without ever having been “taught” them. She can play tunes on the piano without ever having learned to read notes. She has an amazing ear for music, a seemingly magical ability to someone like me who couldn't speak “piano-ese” any more than I could Chinese. She already reads at a third-grade level and is, consequently, often bored in school given that she's still stuck in kindergarten! I’m sure the teachers struggle to keep her engaged in class.

Her high level of intelligence also means she often gets frustrated with the “childish” environment she is forced to inhabit. For the first couple of days of her visit, I couldn't get her to read even a picture book, one that should have been a piece of cake. She refused to concentrate and intentionally misread simple words. Once I caught her not even looking at the page as she recited the words, apparently from perfect memory of a previous reading. She was telling me she was bored with the exercise, that it was too simple. Then I put a much harder book in front of her, one with a hundred words per page instead of just ten. She immediately started reading with no problem whatsoever!

Every time I sit with Eden, I'm reminded of this amazing TED talk by Sir Ken Robinson on education and how we need to completely rethink our approach to it for the next century. Children need individualized attention and they need to be challenged mentally. Our model of education, however, is still rooted in rote memorization, standardized tests, and principles based on the needs of the industrial revolution and the eighteenth century. We take these brilliant minds and force them to sit through a dozen or more years of institutionalized hell called primary and secondary education, memorizing formulas and theorems so they can answer multiple-choice questions on a test before promptly forgetting them. I’m convinced kids are interested in everything that we bother to make interesting for them. If they lose interest in a subject, chances are it's not because they don't “get” it but rather that they didn't “get” an earlier, more important concept, quite possibly because the teacher didn't make the topic interesting.

A scant few will get the privilege of working with amazing teachers who will challenge them while the vast majority will be marginalized by an education system that looks backwards instead of forwards. How else do you explain this popular YouTube video (and others like it) showing children in Indian villages calculating large sums by mentally simulating an abacus for their proud teacher? You can see them fidget with their fingers as they recreate a mental image of an abacus. Is this really how we want to educate our children? Is this really the skill they need to practice for hours on end so they can be successful in the twenty-first century? This is an extreme example but I claim most of the world’s children don't go through a much better educational system.

Here we are with “blank slates” that hunger to learn, children who have the mental capacity to pick up three completely distinct human languages in just a couple of years of ad hoc practice, who teach themselves to play the piano, take your pick of amazing skills you’ve seen children display. We take these geniuses - there is no other word for it - with the massive computing engines they carry around all day, and we sit them down and tell them to memorize formulas that they will never need - instead of helping them understand the deep principles behind those formulas, instead of teaching them to seek answers and not just memorize them.

After we’ve crammed their heads full of data for sixteen to twenty years, we tell them they’re all set for the rest of their lives and send them out into the workforce. This may have worked well when the average life expectancy was forty but it’s a recipe for disaster now that it’s eighty and creeping towards one hundred. What we learned half a century ago in school, assuming we even remember much of it, is stale by definition and no longer relevant to today’s - let alone tomorrow’s - needs. This is a problem we have to address if we're ever going to solve some of our biggest societal problems today. The longer we ignore it, the more we will create a generation who cannot compete effectively in the information age, will feel marginalized, and - in the right countries, with the right influences - will become radicalized. We need a model for continuous lifetime education, one that teaches children to think and learn for themselves in the long run - for the joy of learning, not because they need to make money next year.

But, back to Eden. She has long known how to Google things for herself, order apps on her iPad, watch videos on YouTube, play Words with Friends against the computer, and much more. Compare this to the intellectual universe available to a six-year-old a century ago - or even thirty years ago. There is no comparison. It is amazing how quickly our brains have stepped up to handle the massively larger amount of information coming at us 24x7. We are only now starting to realize the extraordinary cognitive and pattern-matching abilities of the human brain. But, still, we choose to take these amazing supercomputers while at the peak of their learning abilities and lock them into rooms for hours a day, teaching them... to hate learning! Nothing short of a revolution in how we think about education will ever fix that.

Having just returned from Disneyland, where she stood starry-eyed taking pictures with her favorite princesses, I felt a bit subversive as I gave Eden this t-shirt as her birthday gift.



Thursday, May 18, 2017

We Didn't Know What We Didn't Know: WannaCry and the Case for SaaS

“There are known knowns. These are things we know that we know.
There are known unknowns. That is to say, there are things that we know we don't know.
But there are also unknown unknowns. There are things we don't know we don't know.”
Donald Rumsfeld. Former US Secretary of Defense.

“Hedley Lamarr: Unfortunately there is one thing standing between me and that property: The rightful owners.”
Harvey Korman. Blazing Saddles.

“Plan to be spontaneous tomorrow.”
Steven Wright.

I watched in horror last week, as did many of you I suspect, as the WannaCry ransomware crippled thousands of systems around the world and wreaked havoc in almost every country on the planet. I have, you might say, slightly more than a casual interest in the matter. For several years, I managed the team responsible for the SMB protocol, the vector used for the attack, and I was also the head of security for the company for a few years.

I didn't personally write a single line of code in that protocol. That task was delegated to much smarter people; but I did manage the teams responsible for building, testing, maintaining, advancing, and securing it for several years. I was shocked, like everyone else, to see it at the center of an international meltdown of unprecedented proportions. Some one hundred and fifty countries impacted? Over two hundred thousand computers stymied? Hospitals? Emergency rooms? WTF?!?

For the uninitiated, here's a brief summary of the situation. If I understand correctly, the ransomware takes advantage of a previously undisclosed bug in the implementation of the SMB file sharing protocol to take over a computer, encrypt all the files, and put up a message telling the owner to pay up or lose their data. The NSA had known about the bug for years but didn't disclose its existence so it could be used as an espionage weapon. It only became public knowledge a few months ago when a group called the Shadow Brokers leaked a trove of NSA hacking tools online. Microsoft fixed the bug back in March but it had been present in all versions of Windows for years and many companies were caught off-guard as they didn’t realize the potential impact and didn’t deploy the patch on their systems.

This is a protocol, mind you, that was designed back in the eighties when computer networks were still few and far between, was first shipped in 1990, was standardized as part of the CIFS protocol later in the nineties, and has been used for the past twenty-five years for file sharing in every Windows and Windows-compatible product in the world.

Now take one of these servers that benignly implements this file sharing protocol so you can… guess what… share files across a supposedly secure local area network. Add a pinch of magic dust and send it a really screwy, malformed request, one that no sane human being would ever send from a reasonably written piece of software. This malformed request, in turn, triggers a bug in the implementation of the SMB protocol that allows the caller to gain supervisor access to the system. Game over. You can encrypt all my data behind my back and ask for ransom to release it.
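For the technically curious, here's the general shape of this class of bug. This is a made-up illustration in C that I'm adding purely for flavor - it is not the actual SMBv1 code and not the actual flaw, just the classic pattern of a server trusting a length field that the remote client controls:

    /* Generic illustration only - NOT the real SMBv1 code or the real flaw.
     * A hypothetical server-side parser that trusts a length field set by
     * the (possibly hostile) remote client. */
    #include <stdint.h>
    #include <string.h>

    struct request {
        uint32_t payload_len;     /* claimed payload length, supplied by the client */
        uint8_t  payload[1024];   /* payload bytes as received off the wire */
    };

    void handle_request(const struct request *req)
    {
        uint8_t buf[512];

        /* The bug: the copy length comes straight from the request, with no
         * check against sizeof(buf). A malformed request claiming a length
         * larger than 512 overwrites adjacent memory, which a carefully
         * crafted payload can turn into arbitrary code execution. */
        memcpy(buf, req->payload, req->payload_len);

        /* ... pretend to do real file-sharing work with buf here ... */
    }

The fix for something like this is a one-line bounds check, which is what makes these bugs so maddening in hindsight: trivial to fix once you know where to look, nearly invisible until someone looks in exactly the right place.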

Microsoft’s Brad Smith immediately blogged about the need for all parties to share this kind of vulnerability information in order to secure software. It is inconceivable to me, knowing what I know about the teams and process at Microsoft, that they would not have fixed this bug had they known about it. I am not here to apologize for Microsoft or the Windows team or the SMB protocol or the history of computer science. I’m here simply to say that more such bugs will be found in the future, for the simple reason that “we didn’t know what we didn’t know back then” and it’s crazy to continue to depend on such software in today’s world where billions of people are connected to the internet, where nefarious actors abound, and where automated tools can be used to sniff out vulnerabilities.

We spent years designing this software. We spent years testing it. We spent years standardizing it in cross-industry committees and sharing it with partners. We spent years building a community around a protocol that is supported by millions of servers around the world. Our goals at the time were primarily interoperability, usability, and compatibility. We even spent thousands of man-years fuzz testing the APIs to make sure attackers couldn't trigger vulnerabilities in the code. We used specialized tools that generated all kinds of random patterns in the arguments and we worked hard with the community of white-hat security experts around the world to, responsibly, document and fix security-related bugs in all our software.
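If you've never seen fuzz testing, the basic idea boils down to something like the naive sketch below. The real tooling was vastly more sophisticated - protocol-aware, coverage-guided, running continuously across farms of machines - and parse_message() here is just a hypothetical stand-in for whatever parser is under test:

    /* A naive fuzzing loop, for illustration only. parse_message() is a
     * hypothetical stand-in for the request parser being tested. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Stub so the sketch is self-contained; in real life this would be the
     * actual protocol parser. */
    static int parse_message(const unsigned char *buf, size_t len)
    {
        (void)buf;
        (void)len;
        return 0;
    }

    int main(void)
    {
        unsigned char msg[1024];

        srand((unsigned)time(NULL));

        for (long i = 0; i < 1000000L; i++) {
            /* Build a request full of random garbage. */
            size_t len = (size_t)(rand() % (int)sizeof(msg));
            for (size_t j = 0; j < len; j++)
                msg[j] = (unsigned char)(rand() % 256);

            /* Throw it at the parser. A crash, hang, or failed assertion
             * here flags a potential vulnerability to triage. */
            parse_message(msg, len);
        }

        printf("done: no crashes observed in this run\n");
        return 0;
    }

The catch, of course, is that random inputs only find what they happen to stumble across.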

But guess what. No one tried this particular random pattern of bits - except the NSA. And they chose to keep it to themselves because they felt they could use it to spy on people. That's the story as I've seen it reported. Feel free to correct me if you have other data.

Note that “automated updates” (a la Windows Update) are not a solution. Unlike consumers, most organizations around the world spend months retesting Microsoft patches after they are released in order to make sure they don’t break compatibility with business-critical applications, then they spend several more months rolling them out through their complicated networks of thousands of servers. The very same corporations and entities who are the slowest to adopt released security patches are the ones most in need of them, the ones that are highly regulated, fairly antiquated in their processes, and entirely unprepared to deal with a global security event of these proportions.

To me, this is the last nail in the coffin of onprem shrink-wrapped software and the reason more and more services will move to a SaaS delivery model. I’ve blogged about the public cloud several times in the past (on private vs. public clouds, on the death of onprem infrastructure and its rebirth in the cloud, and on the architectural advantages of the public cloud). I hope WannaCry will serve as a wakeup call for all those continuing to depend on onprem shrink-wrapped software.

Much will be written about this event and how it could have been avoided or more quickly remedied. But the real answer is much simpler than all that, so I'll spell it out. We didn't know what we didn't know back then. We are likely to continue to find more bugs - not just in SMB, but also in the millions of lines of operating system code written over the past few decades that is running our businesses today. And the juiciest bugs will be hoarded by hackers and used to wreak even more havoc on our systems. The real problem is that this is a broken model of service delivery, as it relies on local system administrators or, worse, government bureaucrats, to decide when to install a patch. We’ve just seen an example of what that means in real life.

So the hackers will keep finding the bugs, knowing that inertia is in their favor. And they will hide them from others - so they can weaponize them, so they can monetize them, so they can benefit from them. Think about that. It's human nature. And we are all in denial of it. The motive - industrial, government, or criminal espionage - is almost secondary in nature.

The days are gone when it made sense to have so much device-specific code running onprem. The pipes are so much fatter and faster these days that the same services can be offered much more securely from the cloud. The more code you have on your system, the more “attack surface”. The more compatibility you offer with legacy systems, the more successful you are as a platform with the onprem software delivery model, the longer the tail of companies that will be at risk of exposure for years to come. As an industry, we figured all this out a while ago and moved to the cloud as a much more robust and supportable service delivery model, but the rest of the world hasn't caught up with that model yet. They're still running 1990s-era software. Legacy is a bitch.

We can sit here and blame Microsoft but that would be a mistake. It's true that none of the thousands of eyeballs that looked at that particular piece of code noticed that it would misbehave in a peculiar way when handed parameters that it was never designed to handle. Some smart kid somewhere figured it out and it became weaponized. Trust me, there are many other such pieces of code out there. You and I and the rest of the world will pay the price for the next two decades, guaranteed. That's how long it takes to replace these systems in regulated industries. Did I mention that this particular version of the protocol was officially deprecated by Microsoft four years ago precisely because it was known to have fundamental security flaws in the design? Not that it matters. As became obvious last week, hundreds of thousands of businesses were still depending on it to run their applications.

The cloud model of service delivery, where the vast majority of the code runs in the cloud and is always up to date, conceptually bypasses all of these operational problems. If the code is running on our servers in the cloud, instead of on your servers onprem, it's so much easier to patch problems quickly before they become a liability. And trust us; we know how to better manage and patch and upgrade the servers running the code. Better than you, Mr. Hospital in the U.K., anyway.

Fundamentally more coherent and elegant architectural solutions have evolved over the past two decades that cleanly address most, if not all, of the security concerns we deal with every day in an enterprise context. Yet we continue to rely on twenty-year-old technology and complain vociferously as it fails to stand up when measured against our latest requirements and innovations. Continuing to run ancient software in today's hyper-connected world is akin to riding a horse and buggy down the freeway, complaining that it can't keep up with the neighbor’s latest Google-controlled autonomous vehicle, and blaming the poor horse when its knees buckle under the pressure.

If you think your particular application isn't offered over the web as a service, I urge you to do another Google search. Meanwhile, depending on software designed thirty years ago, implemented twenty years ago, and deprecated ten years ago to run your business and trusting government bureaucrats to know when and how to maintain those systems is a recipe for disaster. It is naive and it is irresponsible in the world we live in.

WannaCry is just the first of many. There will be more and they will be worse. I'm sure of it.

Friday, May 12, 2017

Lessons from a Lengthy Career Masquerading as Career Advice

“It is part of the human condition that we are statistically punished for rewarding others and rewarded for punishing them.”
Danny Kahneman. Nobel acceptance speech, 2002.

“In the long run we are all dead. Economists set themselves too easy, too useless a task, if in tempestuous seasons they can only tell us, that when the storm is long past, the ocean is flat again.”
John Maynard Keynes. A Tract on Monetary Reform.

“I feel like a frozen steak thrown on a hot grill. I feel nothing, but suffer anyway.”
Francois Cluzet. The Intouchables.

Hindsight is 20/20. I didn't know any of this as I was going through the experience. You won't either.

It would be fair to say that I'm proud of my thirty-five-year career in the computer industry. I've had the pleasure of working with thousands of brilliant people and, thankfully, have learned a little bit from each of them.

I was a sophomore in college before I took my first computer science class, at the recommendation of an uncle who thought “this computer stuff is gonna be big.” At the time, I was studying psychology. I have no idea why. It was the best I could think of for a major when I entered college. I was still only fifteen at the time and had no freaking idea what I wanted to do with my life. I was breezing through college and was completely bored with psychology. It seemed like mental masturbation: just putting labels on people and on sets of vague symptoms. The fact that a single mental patient, when visited by five psychologists, will walk away with six diagnoses is sufficient proof that psychology is more an art than a science.

In the midst of all this, my first computer science class was a revelation. What? You mean there's only one right answer to the problem? You mean the computer will do exactly what I tell it to do? And if the code doesn't work, the problem is likely to be my own damn fault? Fuck, yeah! Here's a world that was much more satisfying than the vague world of psychology. So I did what every decent sixteen-year-old would do. I declared a double major: Psychology and Computer Science. What the hell do those two topics have to do with each other? Nothing really. I just happened to have already taken most of the classes I needed for a bachelor’s degree in Psychology and wasn't about to just give up on that! In the end, I graduated at seventeen with both degrees and entered the workforce.

Funny enough, having now managed thousands of people and worked with tens of thousands of others, I find myself remembering many lessons from those psychology classes. Now they make sense, now that I've seen dozens of examples of each symptom. Back then, I had no context. I hadn't experienced enough of life to have a frame of reference. As such, the concepts seemed just like a bunch of empty words.

At the time, I was a starving foreign student on an F-1 visa and my only path to permanent residency was to get an employer to apply for a green card for me. But here’s the catch. You can only work for a year after graduation on what is called “Practical Training” in the US. If you do a great job during that year, your employer applies for you, you get an H-1B visa, and that becomes a path to a green card, citizenship, and the rest of the American Dream. If that doesn't happen, you're out of luck and you go back to your country of origin. I was not interested in going back to a country suffering through revolutionary turmoil and a pointless war.

So I desperately needed a job and a sponsor. I ended up taking a job at my local state university as a Computer Science lab manager. What a bizarre job for someone trying to break into the industry as a software developer. Well, that's the best I could do at the time. This is 1982 we’re talking about, after all. Height of the Iran hostage mania, the Iran-Iraq war, the oil crisis, and all that.

I won't bore you with the details. It was not a pretty picture. Here I was, fresh out of classes programming the latest model PDP-11 and Unix, having done Artificial Intelligence classes writing code in LISP and Prolog, having studied heady automata theory... and you want me to do what? Load these trays of cards into this 1960s IBM card-punch reader and change dishwasher-sized disks on aging VAX systems? And this will get me a green card? Okay, I'm game. What the hell.

It's only now, thirty five years later and a million miles away, that I'm actually thankful for having gotten to experience an entire generation of computing. One that was dying, for sure, but also one that allows me to contrast today's world even more starkly with where we were, just a few years ago.

Just think about it. Any kid can pick up a smartphone or tablet today, type in a question, and get an instantaneous answer. Wow. Just fucking wow. Back in my day (can you hear the violin playing in the background?), we still had to go to the public library and use printed index cards to find reference books. Do we even realize how far our world has come in just the past few decades? Fast forward fifty years, at the exponential rate we have been experiencing, and you will see how far we will go. I'm an optimist about the future if only because I've seen how fast this industry can move in the long run.

Don't get me wrong. In the short term, it's nothing but frustration and tedium, bureaucracy and cat fights, bug fixes and meetings. But in the long run, oh my god! Just take a big step back and look at how dramatically we have changed the human experience in just the past ten or twenty years. I am a child of the sixties and seventies raised partly in a third-world country. I still remember having to go to the national phone company office downtown in order to make an international phone call. Today, anyone can connect with anyone anywhere on the planet instantly through voice, video, email, and social media. Holy crap. Now that's progress.

Of course, I didn't understand any of this at the time. I was just struggling to keep up with some of the best folks in the industry. It’s only now that I see the consummation of all those things we worked for over many years: the networking and security standards, the operating system platforms and ecosystems, the advances in usability and interoperability, reliability and scalability. I still have trouble getting my iPhone to work with Google Play when I visit a friend’s home but we can choose from thousands of movies, millions of songs, and dozens of shared experiences whether we’re in the same room or halfway across the world. Now that’s progress and we all had a hand in it. It’s only when you take a giant step backwards and see the impact our industry has had, as a whole, on humanity that you can feel happy about your contributions.

It took me three or four tries, at half-baked startups and mediocre companies, to finally end up at a company where I could work on something I was passionate about. I spent a few years writing device drivers on Sun workstations, then did a lot of Unix kernel work at a multiprocessor, high-end server company. I got to work with every architecture from Motorola to MIPS to PowerPC, writing system components, device drivers, low-level kernel code, system bring-up, even soldering parts on the factory floor when needed.

I eventually made my way to the west coast and spent several years at MIPS and Silicon Graphics working on high-end server systems. At the height of that era, I worked on several supercomputer projects at Silicon Graphics. When I tell people that, they immediately say: “Ah, Jurassic Park!” Well, yes. But we also worked on supercomputers that competed directly with Cray Research for supremacy in the (then highly competitive) supercomputing world. Those were the heady days when I learned everything about computer architecture from the processor all the way up to operating systems and system software in general.

I seem to have worked on a lot of dead system architectures. Supercomputers, UNIX workstations, shared memory architectures. All architectures that have mostly fallen by the wayside as the world has embraced personal computing, the cloud, and distributed computing. I used to fret about this. Why was I always working on these herculean projects only to find out a few years later that a competitor had completely rethought the problem space and come up with a new generation of computing to address it? It was only later that I realized: that’s probably true for almost everyone out there. Every architecture dies out sooner or later. That’s just the way this industry works. I’ve worked on many revolutionary projects - revolutionary when I was working on them - and every one of them has sooner or later been retired to the dustbin of history. Thankfully, each generation learns from the mistakes of the past.

In the process, I also got an opportunity to work with some of the brightest minds in the industry and learn from them. The most important lessons took me years to learn. I was the angry young engineer who quit when my PowerPC-based project at NeXT was canned. Steve Jobs tried to keep me at the company but I was too hot-headed and angry to realize that he had made the right call. He had realized that the battle over processor architectures had ended. It made no sense to compete with Intel by building a PowerPC-based system. He completely killed all hardware projects at NeXT and gave the company a software-only focus. I, of course, stormed out the door. Because he had dared to cancel my project! I was too busy looking at the trees in front of me to see the forest. The processor war was over. The answer was to move up the stack and innovate in software, not to keep fighting the processor war for an ever-shrinking slice of the market. Of course, he then returned to Apple with the NeXT team intact and the rest is history. That's what I mean when I say the hardest lessons take years to internalize. Hindsight is 20/20. I wasn't thinking at that level. I was too emotional about the project I’d just spent so much time and effort on. I couldn’t be bothered to take a step back and look at the bigger picture. What I learned from Steve, later - much later, after I had cooled down - was to fight the right battles. Don't keep fighting the battle if the war has already been lost.

Later, I spent a dozen years at Microsoft working on various versions of Windows. Looking back on it now, we know that Windows lost the PC operating system war and the mobile phone war to Apple, the server war to Linux, and the cloud war to Amazon. Back then, we were too busy pumping out versions of Windows to realize that. It's so hard to put into words the amount of organizational inertia that builds up in an engineering team responsible for a successful platform used by billions of people. They almost never see the disruption coming at them. Or if the leaders do, the rank and file don't. Most of them are too busy pushing the current rock up the mountain.

This is not a complaint about the leadership of Windows or of Microsoft. By the end, I was one of those “leaders”, responsible for all core development in Windows 7. I’m proud of what we did as a team - even with all the warts it entailed. What I learned from Microsoft was how hard it is to build a successful platform that is used by billions of people, millions of apps, and thousands of companies. The more open you make it, the more programmable you make it, the more people that build stuff around it, the harder it is to innovate on that platform later.

What I learned from Bill Gates was an amazing attention to detail. The man could sit through fourteen hours of non-stop meetings where one team after another would prance through, covering topics as divergent as operating systems, productivity apps, the internet, watches, games, research, email, databases, browsers - you name it. He could drill down into the details with the best of them. Impressive mental capacity. What I learned from Bill later, at a distance, was that he was also a decent human being. He could take that brain and apply it to solving much harder problems - education, poverty, disease.

I can sit here and write yarns about what I learned from each of the smart people I've worked with over the years. That would take far more time than either you or I have and take up more pages than either of us would care to read - or write. More importantly, it won't mean much unless you experience it for yourself. Most lessons are lost on us until it's too late for them to have an impact. What I can tell you as a piece of career advice is to work on things you care about. As long as you’re learning, keep at it. There is so much to learn and this industry moves so quickly that you will fall behind if you stop running even for an instant. As long as you're running in the right general direction, I used to tell people, it's all good. Don't try to plan out your entire road trip from New York to LA before you start out. Just make sure you're moving in a generally westerly direction, and keep running. And keep learning. You'll eventually end up in the right place, and you'll have a lot of fun along the way.

Update: I guess I wasn't quite done with this topic. I received several comments on this blog post, all of them positive. Thanks for reading and thanks for commenting. Someone asked why I had neglected to mention "the team" as a motivating factor for sticking around at a job. Isn't that just as important? In fact, isn't it a little self-centered to worry only about whether you're learning on the job?

Yes, of course the team is important. I've enjoyed working with some very strong teams. I've worked on teams with amazingly strong leaders who intellectually challenged their teams, showed them a crisp, clear vision to follow, and worked alongside them in the trenches to reach that goal. I've also worked on teams that were ensemble casts, no single member significantly stronger or weaker than the others and yet each an expert in a part of the problem space. When a team really gels, it can do amazing things that no individual team member could have dreamed of accomplishing on their own.

But "the team" is a double edged sword. The same dynamics that make a team strong are also the ones that can tear it apart. Once a team turns negative, either because some team member is unhappy and dragging everyone else down or because the team is demotivated as the company struggles in the marketplace, it's hard to get momentum back and re-energize the team. I've seen very few teams come back from such death spirals.

So, what's the one factor that keeps a team positive, all marching towards the same goal? I've found, for me, that one thing is learning. As long as each team member feels he or she is learning a new skill, there is harmony and progress. As soon as someone feels their time is being wasted and they're not learning on the job, either because there is too much bureaucracy or because the leadership won't listen to their concerns or because they feel others aren't pulling their weight, they start sabotaging the overall team goal. Either they start complaining and gossiping, or they quit in disgust and make everyone else's job that much harder. Or, worse, they stick around and poison the team from within. Believe me, I've been there - and it ain't pretty.

When I think back to all the companies I've worked for, all the teams I've worked with, and all the projects I've worked on, I ask myself: When was I happy? When I felt I was learning something new. Translation: Leaders should push team members beyond their comfort zones, push them to learn new skills, new technologies, new frameworks, some new part of the business. As long as they are learning something new, they will be happy. As long as they are happy, they will be positive. As long as they are positive, the team is more cohesive and successful. So it's not just about "the team" but about how you motivate that team. It's only if they have to sit in the same boring meetings every week listening to the same boring agenda, reviewing the same five metrics, fixing the same bugs, filling out the same forms that they will turn negative on you. You're not challenging them enough mentally.

The answer may be different for you. Perhaps you're staying in your current job because you need a paycheck every week or can't find another job. I've been there. I know. It's not fun. Here's the only advice I can give you: Don't run away from things. Run towards something. Don’t run away from problems at your current job. Run towards something you’re passionate about. Chances are that the grass is not greener on the other side and that the new company will have many of the same problems you have in your old job. Every time I've changed jobs because I was fed up with a company - for whatever legitimate or imaginary reason - and ended up taking a job for the wrong reasons (more money, bigger title, you name it), I ended up regretting the decision. Every time I took a job because I was damn impressed by the people I talked to, I thought the team was strong, I thought the technology was cool, I thought I could learn something in the process, I ended up happy - both then and later when I’ve looked back on the experience.

Friday, April 28, 2017

Is Life Just a Simulation? Musings on Free Will and Determinism

“What lies at the heart of every living thing is not a fire, not warm breath, not a 'spark of life.' It is information, words, instructions... If you want to understand life, don't think about vibrant throbbing gels and oozes, think about information technology.”
Richard Dawkins. The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe without Design.

“Seconds after fertilization, a quickening begins in the embryo. Proteins reach into the nucleus of the cell and start flicking genetic switches on and off. A dormant spaceship comes to life. Genes are activated and repressed, and these genes, in turn, encode yet other proteins that activate and repress other genes. A single cell divides to form two, then four, and eight cells. An entire layer of cells forms, then hollows out into the outer skin of a ball. Genes that coordinate metabolism, motility, cell fate, and identity fire ‘on.’ The boiler room warms up. The lights flicker on in the corridors. The intercom crackles alive.”
Siddhartha Mukherjee. The Gene: An Intimate History.

“There is no traced-out path to lead man to his salvation; he must constantly invent his own path. But, to invent it, he is free, responsible, without excuse, and every hope lies within him.”
Jean-Paul Sartre. The Last Chance: Roads of Freedom, Volume 4.

“To you, I'm an atheist; to God, I'm the Loyal Opposition.”
Woody Allen. Stardust Memories.

There are two main schools of thought out there when it comes to our understanding of human nature and its capacity for free will. The question, if I may oversimplify, is this: Do we, humans, make decisions based purely and mechanically on pre-programmed inputs or do we have the ability and the freedom to make our own choices?

Philosophers and theologians have grappled with this question for millennia, verily twisting themselves into pretzels trying to justify the existence of free will, believing that life as we know it would be too bleak without it. A religious person might argue that God has already written our destiny and knows what we are going to do; everything is predestined. Scientists are relative newcomers to this ongoing debate. A scientist would say we are just bags of chemicals interacting with each other and the environment, that our actions are the result of our genes and purely physical external criteria. There is no “soul”, no “I” other than a collection of algorithms pre-programmed into our brain through thousands of generations of evolution.

I'm an atheist and, I'd like to think, a scientist. As such, I don't believe in the concept of God and all its related mythologies. I have five senses and everything I ever perceive in life is learned through those five senses. Of course, I understand that the universe contains “data” that I cannot perceive through my five senses. I understand that there are other senses as well; the bat’s sonar is a great example. But the fact that I can't sense everything is no reason to believe in a God that watches over us and pre-programs our every step. That's a giant leap I'm not ready to make. As Christopher Hitchens famously said (I'm paraphrasing): As an atheist, I'm not saying God doesn't exist. I'm just saying I haven't seen a single piece of evidence to prove his existence. I am perfectly willing to change my mind. Show me the data. The burden of proof is on the believers to show the data supposedly at their disposal to prove the existence of such an entity. I'm listening. I haven't seen anything yet. And - yes - I am limiting myself again to what I can ascertain with those five senses, not to some fictional belief or dogma. So, as you can guess, I don't subscribe to the religious view on this topic.

Much recent research has shown massive evidence for the scientific point of view. Scientists have shown, definitively, that the seconds or minutes spent contemplating our choices are simply an attempt by our brains to rationalize a decision that our primitive brains have already made almost instantaneously. You decide whether you love or hate Trump instantly, then you spend all your time convincing yourself that he really is a jerk or our savior. The next few months - or years, as the case may be - are really just mental masturbation, time spent confirming our pre-existing biases - again, based entirely on chemical and neural impulses in the brain that our conscious minds do not control. This post hoc rationalization, along with our confirmation bias, is what we really think of as free will. The concept is just an illusion, a lie our brains tell us to make us feel better. It's a story we tell ourselves to make ourselves feel better. We have no choice but to behave the way we do, to make the decisions we make.

I agree with everything science has shown. Every decision I ever make is heavily influenced by my genetic makeup. This has been shown again and again through scientific studies of fraternal and identical twins. But every situation I find myself in is also unique and has never been experienced by anyone else before. Even the other people experiencing that same moment with me have entirely different backgrounds which, by necessity, means they have a very different experience of the moment than I do. If we think of this moment in time and space as the culmination of everything that has happened to the participants in the moments leading up to it (the scientific view: A caused B which then caused C, all the way back to the Big Bang), then each moment is unique not just in itself but also in its interpretation by each of the participants. There is no single “now” but instead, there is “now as experienced by Jack” and “now as experienced by Jane” and everyone else.

The choice I make at any given moment is, of course, massively influenced by everything that has come before it, every experience I have lived through, and every gene I've inherited. But the moment itself is unique and has never happened to anyone else before - in history. My actions may be automatic but the combination of all our actions together is not. You don’t know what I’m going to do next and I don’t know what you’re going to do either. That, in itself, introduces probability into the mix, making our combined future together nondeterministic. I may just be executing the next inevitable step in a program but that program has never been executed before nor will it ever be executed in exactly the same manner again.

Some people have even suggested that life is just a simulation - a proverbial Sims game writ large. These types of explanations are interesting but don’t really get to the heart of the problem. It only looks like a simulation because that is the metaphor we are familiar with as children of a certain age. To say that life is a simulation is no more meaningful than saying that it is created by an invisible yet omnipotent omniscient being. It avoids answering the real question by assuming the existence of a creator, in this case the programmer, conveniently placed outside our “world”.

Now, here comes the pretzel: The more satisfying explanation, the one I choose to believe, is that we are truly creating every moment on the fly - one moment at a time. We are, in that sense, the creators of our own destiny. We are writing this story. We are truly making it up as we go along. Because this particular moment has never ever happened in the past. And there are at least seven billion versions of “this particular moment”, seven billion “stories”. Each of us may just be playing out preprogrammed actions at each step, but the combination is unique and new. And the more people, the more relationships, the more ideas, the more variables, the richer and the more unique each moment. Isn't that enough?