Driverless Cars? Not So Fast

Driverless cars (DCs).  The concept is everywhere these days, and according to many futurists the actual cars soon will be, too.  Every tech outfit worth mentioning has a finger or two in the DC pie and several big-name consortiums already have prototypes rolling.  A few cities have okayed test programs, and Colorado legislators, ever alert to the chance to lure more technology dollars to the state, are proposing a friendly set of regulations designed to make our admittedly deteriorating roads more attractive to robot rides.  A bill being considered in the legislature would set state standards for testing of DCs and preempt Colorado cities from enacting more restrictive rules (Boulder, for example, is rumored to favor allowing only electric vehicles, each powered, one might suppose, by its own wind turbine).  The most optimistic press releases have the technology highway-ready in five years or less.

As a lifelong technophile, I’m usually excited about this kind of forward leap.  But as a pragmatic Dilbert-type, experience has taught me to be skeptical of anyone touting some revolutionary breakthrough that will forever change the way we get around.  After all, I’m still waiting for my flying car.  So let’s take a clear-headed look at the promise of the DC.

State Sen. Owen Hill is one of the sponsors of the Colorado DC initiative.  In a recent interview, Hill hyped the safety angle of going driverless, claiming that DCs could prevent nearly all of the 40,000 highway deaths that occur each year by “removing human error” from the equation.  This kind of statement could only come from someone blithely ignorant of the level of complication involved in building such a system, one created, initiated – and debugged – by humans.  The human error factor won’t be eliminated.  It will only be moved to another part of the process.  And humans, at least when it comes to driving, are a lot more capable than we get credit for.  Navigating an automobile through the real world is an extremely complex and variable undertaking, but 99% of human drivers handle it well 99% of the time.  The explanation for that is our ability to learn from experience.  Every mile we drive adds to our experiential library, and we have the unique ability not just to remember events but to absorb and reconstitute them to meet and deal with new, unprecedented situations.  It’s called intelligence.  We have it.  Computers, at least for the moment, do not.

So what is called for in the DC is AI.  Yep, artificial intelligence: the Holy Grail of technology, enabling a computer to do what 16-year-olds with learner’s permits have been doing since the 1920s.  When we drive a car, we (most of us, at any rate) think and reason.  Computers in DCs execute their programming.  Staying in a lane, keeping a safe distance from the car ahead, stopping at red lights (a novel concept), avoiding old ladies walking their Pomeranians – all these actions are relatively easy to program.  Experienced drivers do them reflexively, often while texting, digging a Tic Tac from between the seat cushions, yelling at the kids or applying mascara.  But sooner or later a situation will arise that demands that whoever, or whatever, is in control of the vehicle make a split-second decision based on maybe 10 data variables.  If the programmers have missed even one of these, tragedy will ensue.  A human driver is equipped to take in all 10 and respond effectively.  Computers are not there yet.

A completely computer-driven transportation system would be an unprecedented achievement, but we’re not talking about that.  We are talking about a construct which will have to accommodate and integrate with human-piloted vehicles far into the future.  DCs will be mixed with UPS trucks, yard-care pickups towing trailers, buses, street sweepers, snowplows, bike couriers and 20 years’ worth of used Civics and Corollas, all being driven, with varying degrees of incompetence and distraction, by humans.  DCs will undoubtedly be programmed to provide utmost safety for their occupants, so they will take no chances in traffic.  Real drivers will soon recognize, and try to take advantage of, this tendency.  Imagine navigating rush hour with a few thousand super-cautious robot drivers clogging traffic lanes and an equal number of super-aggressive flesh-and-blood dolts doing their best to get ahead.  Think that will make for error-free mobility?  The only way to totally eliminate human error is to eliminate humans.  In other words, every vehicle on every street and highway would be not just under the control of its onboard computer but in constant contact with every other vehicle anywhere nearby and with a central traffic management system keeping track of the whole mess.  Imagine a cross between a beehive and a Mars landing, all with a couple levels of redundancy.  Such a system would use up most of the servers on the internet just to run Denver.

But assume for the sake of argument that we can create this system.  Who will pony up the bucks to retrofit the millions of used cars already plying the streets and alleys of the country with tech that may cost three times what the cars are worth?  What happens when a hailstorm wipes out the sensor arrays of several hundred cars while they are rolling down the Interstate?  Not to mention vandalism.  And who is Frank Azar going to go after when, not if, an accident happens?  The owner of the car (who was in the back seat doing the Times crossword)?  The manufacturer?  Google?

These are some, but by no means all, of the problems to solve and the policy issues to decide, before we can become a society free of fender benders and road rage.  DCs in their beta form may be on some roads, somewhere, maybe even in Colorado, within the next five years or so.  But the day when you can climb into a true DC that will take you anywhere you need to go quickly, cheaply and uneventfully is probably a lot further away than prognosticators would have you believe.

Which is fine with me.  I like driving.  Besides, it’d be no fun flipping the bird at an empty driver’s seat.

Third and Final [?] Phase of America’s Civil War

Phase 1 of America’s Civil War was a horror – the number of soldiers who died from a combination of battle and illness was over 750,000, “far greater than the number of men who perished in all other U.S. wars put together.” Ecstatic Nation

Human beings are complex creatures and many things drove the war, but slavery was at its core – in the new states of the west as well as the old south.

After such a terrible war, the North was willing to turn towards commerce and away from black citizens. Today, we might call the Ku Klux Klan and Jim Crow an insurgency – it certainly was violent enough to qualify.

There was a huge riot in New Orleans, which really turned into a massacre against the black community in 1866, and then there were acts of mob violence against black voters. And in broader Louisiana, you had some of the worst political terror and mob violence committed in all the Reconstruction years, most famously the Colfax massacre of 1873, which was the largest mass killing in American history until 9/11. Isaac Chotiner slate.com

Gradually the violence decreased (though it never disappeared) and a new normalcy settled on the backs of black Americans. Many whites in the defeated South began to “write magnolia-scented history” where Lee was nobler than Grant and Confederates were finer men than Unionists. In an exception to the common view that the victors write history, the South was fairly successful in its efforts. Ecstatic Nation

Phase 2 launched a hundred years later with the Civil Rights Movement – there was more violence but also more progress towards a fair and democratic America. In the mid-1970s, society settled down again – another new normal.

Perhaps we are entering Phase 3 after only forty more years.

Is There a Bee in Your Bonnet?

I heard this phrase on a recent cable news show and it struck me as rather old-fashioned for TV.

Idioms says:

The phrase has been around since the 1500’s… The literary origin is seen in Alexander Douglas’s work titled ‘Aeneis’ which was published in the year 1513 but it was not exact. In 1790, Reverend Philip Doddridge’s ‘Letters’ cited the phrase as it is used currently.

The post adds that the phrase once seemed specific to women and that for men the variation was “bee in your head.” Apparently, as with “author” and “actor,” modern English is losing some of its gender distinctions. That’s probably a good thing when it’s not awkward. I’m a volunteer firefighter (not fire woman or fire person!) and a foreman can be a crew leader. Hurray for a living language.

Book Review:  Concrete Economics, by Stephen S. Cohen and J. Bradford DeLong

This isn’t a normal book review, because I’m not going to plug this book.  More the opposite, actually.

Concrete Economics is in most respects a traditional work in that field.  For openers, it is literally a cure for insomnia.  I read at night before bed – in bed – a practice most sleep experts say is likely to ruin your nocturnal regimen, and I have had more than a few nights “ruined” by the likes of Michael Lewis and Oliver Sacks.  After a few pages of Cohen and DeLong I was usually more than ready to turn out the light.  At 170 pages, this tome should have been a one-sitting read for me; instead it took nearly a month to finish.  Even by the standards of the Drear Science, this book was a slog.

The authors’ style is heavy with paragraph-length sentences, multisyllabic words and overly formal diction.  For example: “The East Asian Economies were eager to build up their manufacturing capacity and capability, and our ideologically motivated redesign of the American economy told us that we didn’t really care, because we didn’t really want those sectors.”  Did I mention the mixed metaphors?  “Alexander Hamilton: the only individual who may have been more than the tip of the spearhead of the heavy shaft of an already-thrown, near-consensus view on pragmatic economic policy.”  I nearly fell asleep typing that.

The book came highly touted by Paul Krugman, Nobel laureate and oracle of Progressive economics, so I expected to disagree with much if not all of the content.  I was (mostly) wrong.

Lost style points aside, the authors make a fair case that some economic planning by the Federal government is essential to the success of the Republic (italics mine, certainly not Krugman’s).  Early on they discuss the formative post-Revolutionary policies of Hamilton, who pushed the fledgling US government to assume the colonies’ war debt, establish a central bank and, most crucially, pass a series of tariffs designed to protect America’s emerging manufacturing sector.  The only way the country could gain real economic independence, argued Hamilton, was to industrialize. The plan worked well.  US makers were soon thriving and the tariffs protecting them were the government’s main source of revenue until the advent of the income tax.

Hamilton’s protective tariff model was adopted by Japan and, later, China as those countries struggled to join the Industrial Revolution.  Contemporary fans include our current president, who would also appreciate the authors’ use of words like “huger.”  I think they meant, “more bigly.”

Society on a Crash Course Over Fetal Rights

Extreme prematurity is the leading cause of neonatal mortality and morbidity due to a combination of organ immaturity and iatrogenic injury. Until now, efforts to extend gestation using extracorporeal systems have achieved limited success.

Here we report the development of a system that incorporates a pumpless oxygenator circuit connected to the fetus of a lamb via an umbilical cord interface that is maintained within a closed ‘amniotic fluid’ circuit that closely reproduces the environment of the womb. [my emphasis] Nature

There have been several articles about this study – I’ve quoted the abstract. Don’t you love science-y phrases? Extracorporeal systems – so specific. Take a look at the pictures on the link – both creepy and fascinating.

As the authors say, in the past “advances in neonatal intensive care have improved survival and pushed the limits of viability to 22 to 23 weeks of gestation,” but at the cost of complications and permanent disabilities.

This current achievement is amazing – using lamb fetuses, researchers got one to survive and grow with normal lung and brain development. Not all the fetuses did so well – there’s a lot of work to do before this device can be used on humans.

But that’s coming.

That’s the report from science – but what about public policy regarding contraception, women’s rights, and abortion?

This issue has been creeping up on us for decades. The once traditional notion that a fetus became a person when it quickened in the womb (an event that the mother needed no technology to discover) has long since been replaced by various measures of viability with various degrees of scientific support. Such hair-splitting will disappear when an artificial womb is developed – if not from the research quoted above, then from other work. And soon.

Science may inform the debate, but it can’t solve our policy problems. Now is the time to discuss what we, as a society, should do. I don’t want to chase the threshold for abortion backwards through pregnancy. All that will do is entrench and enrage existing opinions.

There’s a lot to think about.

Cut and Dried

This phrase, which means something is clear and beyond dispute or settled in advance, struck me as another hay farming phrase. After mowing/cutting hay, it is left to dry (moisture content is important, so this step takes skill) before it’s ready to bale.

Wiktionary has my guess in mind, saying the phrase comes from herbs being cut and dried for sale, rather than fresh. I suppose that implies the herbs are stable and not going to change – at least, that’s my guess on how the literal and figurative meanings connect.

The first citation of the expression, which must have already been in use or it wouldn’t make sense to use it this way,

is in a letter to a clergyman in 1710 in which the writer commented that a sermon was “ready cut and dried”, meaning it had been prepared in advance, so lacking freshness and spontaneity. worldwidewords

Many people mishear the standard expression as ‘cut and dry.’ Although this form is listed in the Oxford English Dictionary, it is definitely less common in sophisticated writing. The dominant modern usage is “cut and dried.” public.wsu.edu

In Search of Settled Science

The media coverage of last weekend’s March du Jour, this one supposedly a celebration of Science (capitalization mine), portrayed the event as just that – celebratory.  But when Progressives get together carrying signs it almost always means a demonstration, and this gathering was as much a vehicle for the Left to chide conservatives about their refusal to accept the “settled science” of human-caused climate change as it was a paean to Science itself.

Watching the festivities unfold, I thought of a recent commentary by Vincent Carroll in the Denver Post.  He reported that Boulder County Commissioners had just voted to ban the growing of all genetically modified (GMO) crops on land owned by the county.  This edict will be problematic for farmers who have been raising GMO corn and sugar beets for many years on this leased land because, according to Carroll, there no longer are any non-GMO strains of sugar beet.  The farmers will have from three to five years to eliminate GMOs from their rotations.  Case closed.

Here’s the Science rub.  There is no scientific evidence – none – that genetically modified crops are harmful to humans, insects or anything living.  The decision to flatly ban them flies in the face of all the research that has been done on the subject, and will do nothing but cause harm and hardship to the affected farmers, many of whom have tens of thousands of dollars tied up in equipment used to grow and harvest a crop which they can no longer plant.

The GMO ban was met with loud approval by liberal Boulderites, many of whom no doubt paraded last week in unwavering support of Science. In fact, Boulder liberals show the same disregard for GMO research that conservatives hold for the study of man-caused climate change.   Clearly science denial knows no political affiliation.

Why this distrust of science cutting across the political spectrum?  Science is supposed to be provable, reliable, the epitome of fact.  Remember junior high science class, where we learned the basics of the Scientific Method?  Start with a theory – what do you think is happening and why.  Then try to dream up an experiment that proves your theory, or disproves someone else’s.  Compile your results.  Then the most important step: submit your findings to others who will try to duplicate them, using your methodology.  If your experiment can be repeated by others, your “peers”, then and only then are your conclusions scientifically valid.  That’s how science works.  Or used to.

Peer review has been the backbone of scientific investigation since Isaac Newton lounged beneath his apple tree, and the science it produced seemed for the most part apolitical.  These days science methodology is becoming bastardized, thanks in large measure to our newfound reliance on computers and algorithms instead of beakers and Bunsen burners.  For example, our seemingly unlimited capability to gather and analyze massive quantities of data has led to the proliferation of often agenda-driven studies that arrive at their conclusions by asking a large number of subjects a long series of questions under the assumption that a small but publishable number of queries will yield a positive result (i.e., the result the authors wish to see).  This statistical alchemy was used in a study released last year which pointed to an increased incidence of certain types of cancer in communities located downwind from good old Rocky Flats.  More traditional studies have found no such link.  More recently, another megadata study found an increase in dementia and strokes in people who drink diet soda.  The researchers relied on data from massive numbers of soda sippers (full disclosure: I drink two or three cans a day) but somehow failed to correct for obesity and several other possible variables.  Another junior high science lesson: Correlation does not automatically equal causation.

Each of these studies was ostensibly peer reviewed.  But that most vital step in the process, according to many in the scientific community, has become sloppy and incestuous, bowing to political pressures and the “publish or perish” dictum so pervasive in academia.  The problem has become so widespread, according to a study published last year in Nature, that researchers attempting to replicate other scientists’ experiments were failing to get the same results more than 70% of the time.  More than half the time the results could not even be duplicated by the original researchers.  When the supposedly peer-reviewed (and widely publicized) study that claimed to find a link between vaccinations and autism was debunked, The Lancet, the journal in which it was featured, took a dozen years to fully retract it.  That study triggered a public health crisis in Britain, and its author was eventually found guilty of serious professional misconduct and stripped of his medical license.  In spite of the blatant misapplication of science involved, thousands of American parents continue to cite the study when refusing to have their children vaccinated.  Most of these doting parents are well-educated (and liberal).  So much for the robustness of peer review.

Stories like these invite skeptics of all political lineages to dispute the results of what may be credible, critical studies, and contribute to the ideological fog that is threatening to smother the legitimate, rigorous methodology behind the bulk of scientific research.  They also imply that there are both liberals and conservatives (and evidently some scientists) willing to bend science to their ideology.  So forgive those misguided wretches who choose to take the assertion that human activity is the primary cause of global warming with a grain or two of salt.

We all want and need Science to be worthy of celebration, but clearly the science establishment has some housecleaning to do.  To regain our confidence those who do science right and proper have to be willing to call out the ones who distort its process for their own ends.  The rest of us, meanwhile, need to improve our science literacy so we can recognize questionable science when we see it, even if it means looking past our ideology.  Best that we reach consensus on climate change, among other headline issues, before the research findings become moot.

Events will eventually settle the scientific disputes that bedevil us.  Hopefully we will survive the proof.