2/11/2020 by jdabulis

Krista Doesn't Code With Her Face

If you’re likely to be entertained or informed by an article that turned out to be something of a rant against companies acting with negligent disregard for privacy and security, by all means, read on.

Shadow Inc.

Krista Davis is CTO and Chief Architect of Shadow Inc., the developer of the creatively named IowaReporterApp -- the app that was supposed to power reporting of the Iowa Democratic Party Caucus results.  In case you were wondering (and Krista sets the record straight on her LinkedIn page), she in fact codes with her brain and not her face!  Of course, in hindsight this baseless claim seems like an admission of a missed opportunity, or an otherwise disappointing factoid, as the terrible quality of this software and its effects became fully known.

Without even getting into the politics surrounding this software (which is, in fact, equivalent to a pile of poo) or its politically opinionated and motivated developer, I think it's uncontroversial to say that every high-profile technical fiasco that threatens confidence in democratic processes, makes a government service unavailable (e.g. healthcare.gov), or leaks personal Facebook data worries users about their privacy and erodes their trust in adopting new technology.  It seems that nearly every day some new story appears reporting an exploited security hole.  I care about the victims of terrible software.  I also care that these high-profile failures could make investing in new mobile application development a harder sell.

Now, I can only take credit for my conscious biases, since by definition I'm unconscious of the unconscious kind, though our iconoclast Krista might still think I'm some kind of "enabler."  Let it never be said that her terrible software enabled anyone.  And so too is LinkedIn an "enabler" with its unwoke personalization features.  It turns out posting profile photos has been enabling all of us to associate faces with names for years (with or without any brains behind whatever we're up to).  I must say, I'm less convinced now that whatever unconscious biases might be afoot are useless or unwarranted.

The problem is that the complete failure of this software application is not just an unhappy event that we should get used to.  As our society increasingly depends on and trusts technological innovation to automate our cumbersome yet important activities, this was an unmitigated, unnecessary and blameworthy disaster.

Shadow Inc. is not sending their best.

A larger point can be made: reliable software development takes more than a great deal of technical knowledge and experience.  It also requires clarity and precision of thought.  I can't imagine how unnatural it must seem for a person with, let's say, an absolutist worldview (or one quick to misjudge and label innocuous actions as wrongdoing) to exercise objective and dispassionate technical leadership, given the many critical decisions that must be made correctly in order to pull off the delivery of a high-quality software product.

It’s true that not every decent software developer needs to be a deep thinker with a habit of introspection and an active interest in philosophy.  But they should exhibit at least a modicum of common-sense self-awareness.  I myself have worked alongside many fine engineers, sometimes surprised by their inconsistent personal views, or their beliefs which I find incompatible with what I take to be objective reality.  I’m not usually worried by this; it's something of a trend to reject the notion of objective truth and reality.  Some people are simply eccentric.  But in my experience, these have been talented, highly-educated, serious people who are able to distinguish their emotions and feelings from consensus facts, and channel their better judgment.

But it’s also true that not everyone so described can succeed at maintaining the necessary wall of separation between emotional, fantastic and chaotic thinking on the one hand, and objective rationality on the other.  There are many reasons a software project can fail.  One wonders whether some failed software projects are the symptom of an irreconcilable chasm between sincerely held yet inaccurate beliefs about the system and its environment (the model), and a more veridical assessment.  Wishful thinking and confirmation bias can also blind one to otherwise obvious risks.

If things go horribly wrong with securing your personal data, would a perfunctory apology be all you expect to address that glaring exhibition of incompetence, negligence or mischief?  A bit of easy damage control and the whole thing will be memory-holed by the next news cycle?  I generally find an apology to be an unsatisfying and unsuitable reaction to serious errors and negligence; I can't understand why people are constantly demanding apologies.  Apologies accomplish nothing, except to the emotional.  I'd rather have a permanent correction of the problem, such that damaged parties are made whole, and confidence has been restored.

Elections are nothing to mess with.  Until quite recently, the United States enjoyed an electoral process that yielded complete and uncontested results with confidence.  Typically absent were the problems and "irregularities" that we know from the history of other countries.  Such problems have led to violence, insurrection, and even civil war, especially when there is a suspicion of corruption.  Yet there are time-tested ways of preventing all of this, by securing applications and data.  Security checklists are available for anyone who will invest the effort to implement them.  There is no excuse for amateur-tier security flaws in any application, let alone one used to collect and transmit election results.  This is the last thing a society of increasing political polarization and fading trust in flawed political institutions needs.
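By way of illustration, one standard checklist item -- never store or compare passwords in plaintext -- takes only a few lines of standard-library Python to get right.  (This is a minimal sketch of the general technique, not anything from Shadow Inc.'s actual code; the function names are my own.)

```python
# Minimal sketch of salted password hashing with PBKDF2-HMAC-SHA256,
# using only the Python standard library.
import hashlib
import hmac
import secrets

ITERATIONS = 600_000  # assumed work factor; tune for your hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest) suitable for storage instead of the password."""
    salt = secrets.token_bytes(16)  # unique random salt per credential
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

None of this is exotic; it is the kind of baseline hygiene the checklists spell out, and skipping it is exactly the amateur-tier flaw there is no excuse for.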

But if it were just a threat to the constitutional right to fair elections, that would be bad enough.  Unfortunately, we are continuously reminded that banks, insurance companies, retailers, and social media sites are demonstrably putting our finances and personal data at risk of fraud and exposure.  A poll from last year showed Microsoft to be the most trusted technology company.  Why is this?  Because Microsoft is not perceived to have an overt political agenda.  Bing and DuckDuckGo don't "protect" you from finding the most relevant search results.  And it didn't help that Google quietly dropped its "Don't be evil" slogan a year before that.  Everyone has opinions.  Some opinions are unpopular.  There are worse things that could be done in a free country by private organizations than letting people freely express them.

Even if you are a small business -- especially if you are a small business -- whether your application stores personally identifiable information, authorization credentials, credit card information, or other sensitive data, I ask you to demand that your application developer show evidence of code quality and proper testing, to protect your reputation and ensure user confidence and privacy.  Companies failing to responsibly safeguard privacy and data security not only put their own reputations and the privacy of their users at risk, they may well put the order of our society at risk.

© Lognosys LLC. All rights reserved.