Monday 30 December 2019

Small Could be Beautiful?

Over the past two centuries, the average height of humans in developed countries has increased -- microevolution in action, although it can be argued that this effect is due more to improved health care and nutrition than to any significant genetic changes.  One doubts that tall people have more children than shorter ones.  In any case, it should be possible to breed humans for simple changes in stature, just as we have bred dogs and other animals of different types and sizes.  Leaving aside the shadow of eugenics for the moment, I want to suggest that, instead of breeding taller humans, we should breed ourselves shorter -- significantly smaller.

Taller people are not necessarily healthier, smarter or more successful, just as shorter people are not, in general, less well off, less healthy or of lower mental acuity.  However, it is clear that taller or bigger people need and use more resources than smaller folk.  Very tall people, aside from being better at basketball, have more trouble with doors, cars, chairs, clothing, and other standard accoutrements of civilization than normally sized people do.  Moreover, very tall people, and certainly oversized (that is, wider) people, have health issues that occur less frequently in smaller (or thinner) people.

We already have some small people among us: those with various types and degrees of dwarfism, often proportionate (sometimes called midgets), for example.  Although they may have some associated health issues, they can live happily, even in our normal-sized cultures.  Now imagine what would happen if everyone was smaller, say about one meter tall, give or take -- child or hobbit size, you might say.

There would be significant benefits to humanity and the world if we were all smaller.  If houses, cars, roads, etc. were designed for people one meter tall instead of 1.7 m (5 ft 6 in, the approximate current average), and if we were proportioned about the same as hobbits, or normal five-year-old children, our average weight would be only around 20 kg or 45 lbs.  This would solve or mitigate all sorts of obesity issues at the very least!  It would also mean that we would need less food and fewer other resources to live a healthy and fulfilled life.
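
As a rough check on that weight figure (my own back-of-envelope arithmetic, assuming simple proportional scaling and a representative 70 kg adult, so treat the numbers as illustrative), mass scales with the cube of height:

\[ 70\ \text{kg} \times \left(\tfrac{1.0}{1.7}\right)^{3} \approx 14\ \text{kg} \]

Real five-year-olds of about that height actually weigh closer to 18-20 kg, since children are not simply scaled-down adults, so an average of roughly 20 kg for meter-tall people is in the right ballpark.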

A world of hobbits could support at least twice as many people as our world does, for the same resource usage.  This would solve the "overpopulation" problem many seem to worry about.  In any case, being 1 m tall would greatly reduce your "carbon footprint", not to mention your actual footprint!  Small shoes should be cheaper to make, transport and stock.  This saving applies in spades to houses and cars, and all that they entail.  Instead of 8 ft ceilings, queen-sized beds, doors 32" wide, and so on, we could have 5 ft ceilings, and everything else proportionately smaller too.  A 500 sq ft house would serve as well as a 1500 sq ft one does today.  Think of the heating and air conditioning savings that could accrue, not to mention the capital cost of building and buying the house.  Smaller schools, malls and other buildings would multiply this benefit.
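
The floor-area saving follows from the same sort of scaling (again, my own rough arithmetic): if every linear dimension shrinks by the factor 1.0/1.7, floor area shrinks with its square:

\[ \left(\tfrac{1.0}{1.7}\right)^{2} \approx 0.35, \qquad 1500\ \text{sq ft} \times 0.35 \approx 520\ \text{sq ft} \]

The same square-law factor is where the "less than 35% of the area" figure for cities, two paragraphs below, comes from.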

In particular, cars could be four feet wide (1.2 m) instead of six or more, and much shorter in length and height as well.  This would reduce their weight by 70% or more, while increasing their gas mileage accordingly.  Trucks could also be smaller since most of the cargo they carry would be reduced in size and weight accordingly.  I suspect that less human and metal momentum would make traffic accidents less serious as well, even without reducing speed limits.  Of course, smaller vehicles would mean narrower roads, smaller bridges, and other infrastructure, all saving additional cost and resources.  Lower taxes maybe? (dream on).

Narrower roads and smaller buildings would allow more people to live in the same area, or the same number of people to live in less than 35% of the current area of our cities and towns.  Smaller cities mean quicker commutes, less pollution, and freed-up land for agriculture, recreation or nature.  One can envisage all sorts of benefits that would accrue if humans were about one meter tall on average.  Even space exploration would be easier and less costly, given the cost per kg of placing humans into orbit and maintaining them there.  Little green men, meet little pink men!

With everything human much smaller, unchanged natural environments - trees, plants, animals, rivers, etc. - would seem larger to us, but human activity and interactions would be largely unaffected.  There is little, aside from most sports, that would suffer from making people significantly shorter, and sporting achievements should be easy to reset for smaller folk -- scale down Olympic records, football fields, etc.  In developed countries, there are few jobs that still require large people, aside from the arbitrary standards for firefighters, military, etc.  Indeed, smaller soldiers would be harder to target, and could fly smaller airplanes, or drive smaller (and cheaper) tanks. Big construction equipment could be just as easily driven by smaller workers, as could farm equipment.

With some joint effort by all nations, coupled with advanced genetic knowhow, we could easily breed ourselves to a smaller stature, and could probably achieve the one meter average within, say, six generations, so that by 2200 the goal would be achieved and we could start reaping the benefits noted above.  This would go a long way to addressing the climate change worries that some people have.  In principle, meter-high humans, with associated right-sized infrastructure, should use less than a third as much carbon as we full-sized ones do, even without other changes.  If we can breed dogs one quarter the size of wolves, reducing the average human height by only 40% should be easy.

This is all tongue-in-cheek, of course; I am not seriously proposing that we try to shrink humans over the next 150 years.  Not only would such a proposal be laughed off as a joke, be impossible to enact meaningfully, and attract plenty of negative responses, but it would also raise the ugly prospect of eugenics: the use of genetic manipulation and coercive laws to make humanity "better" according to some elite goal or standard.  Eugenics has a nasty track record in 20th-century Germany, the USA and elsewhere.  If we start breeding smaller humans, why not smarter, or more beautiful ones?  That way leads to a Brave New World that inevitably results in totalitarian measures, and in groups of people being judged unfit, or of lower importance, by those deemed superior somehow.  I certainly do not want to go there.

So, people as hobbits is an interesting exercise, but it must remain a thought experiment.  We will have to find other ways to limit our footprints on the world we have, and make more efficient use of its limited resources.  Alas, our track record for doing that in the developed world has been less than stellar.  So unless everyone is willing to shrink their stature, we should all try harder to shrink our negative impacts on the planet.  If we all did that, we would not have to wait 200 years to see the benefits.

Tuesday 19 November 2019

Is Science Going Off The Rails?

Science is a human pursuit based on asking intelligent questions, making educated guesses, trying different things, and then seeking to understand what it all means.  The scientific method applied over past centuries has greatly improved life for humans all around the globe and has increased our understanding and control of nature enormously.  I love science and have studied it all my life, but in recent times, the practice of science seems to be going off the rails in some ways.  Science should not be influenced, much less directed, by personal power, fame or greed.  It should not be swayed by political or ideological biases.  Yet scientists too are human, and the institutions, research centres and funding agencies are all too human as well, so those unwanted influences have been creeping into many aspects of scientific work.

The best-known examples of this are in the so-called soft sciences - sociology, psychology, anthropology - as well as in health sciences such as psychiatry, nutrition, medicine and related specialities.  There have been recent attempts to verify or reproduce published findings in some of these areas, and in many cases - even most, by some reports - the published results were found to be wrong or not supported by the evidence.  Then there are the human studies that tend to confirm politically correct assumptions and biases regardless of the evidence, or whose experiments are skewed to yield preferred results irrespective of the truth.  There has been much soul searching in some quarters, and the number of retracted papers continues to mount.

We all make fun of the nutritional flip-flops from experts on things like the health dangers (or benefits) of wine, salt, animal fats, eggs, coffee, and so on - to the point that few people take such pronouncements seriously now.  Even the gold standard of randomized, double-blind drug studies can come to unwarranted conclusions based on selection bias, corporate pressures, statistical legerdemain and other human foibles, conscious or otherwise.  Doubtful therapies, unnecessary procedures, questionable medication regimes abound, all at great cost.  Such concerns are now a regular component of science publishing.

In the area of psychology and mental health, discredited Freudian theories and social Darwinist influences can still be found.  Then there is the over-medication of any "condition" considered outside some assumption of "normalcy".  Hence we get huge numbers of people on anti-depressants, anti-anxiety drugs, ADHD medications, and so on.  Over-medication has also led to the problem of antibiotic-resistant diseases, and to much of the opioid crisis that is currently devastating many in our society.  Yet the pharmaceutical industry continues to promote new and often unproven drugs.

Under the spell of political correctness, many scientists and related professionals pretend not to know when human life begins - a simple scientific fact understood for more than a century.  More recently, under the fad of transgenderism, we allow doctors to experiment on children with hormones, to mutilate their otherwise healthy bodies, and to leave them infertile and on drugs for life, all supported by pseudo-scientific pronouncements from "gender theory experts".  Meanwhile, we make certain types of counselling illegal because powerful interest groups don't like them and dredge up "studies" to support their agenda, despite evidence that other people want and benefit from them.  The ever-progressive media lap this up, and the confused public is left wondering, or silenced by self-censorship on social media.  Yes, some of what passes for science these days can be autocratic, obsessive and quite ugly.

The hard sciences are no longer immune to such improper influences, or to the need to follow certain ideas and approaches in order to chase further research dollars.  Climatology is unduly enamoured of the "settled science" of climate change, notwithstanding the lack of hard evidence and the mounting strikes against the theories, models and over-hyped alarmist pronouncements.  Regardless of what you think about climate change, it is not good science to overstate selective or uncertain findings while shutting down or denouncing research results from the other side of the issue.  False projections, fear mongering and demonizing your doubters do not promote public respect for the authority of science.  Scientists are supposed to try to disprove their theories, not shore up their preferred hypotheses with filtered data while ignoring results they don't like.  "Science is faith in doubt", as some have said, but this does not seem to apply to certain favoured theories these days.

Biology in general has been wedded to Darwin's theory of evolution far too long, ignoring or denying counter-evidence that continues to pile up.  This faith in an unproven theory-by-extrapolation has led to factual errors such as the myth of "junk DNA", and to epicycle-like attempts to account for all the myriad complexities of life forms.  The Darwinian paradigm is crumbling, but most biologists still fail to recognize the fact, and many go out of their way to censor, dismiss and shut down alternative viewpoints and their findings.

Some branches of chemistry are likewise swayed, or even blinded, by their deeply held but unscientific preconceptions.  The obvious example is origin-of-life (OOL) research.  Not a year goes by without some scientist announcing yet another "breakthrough" in OOL research, which is then reported as if it were just a matter of connecting a few dots to explain how life arose on Earth.  Meanwhile, the truth is quite the opposite: the more we learn about biochemistry and related areas, the harder it becomes to find credible natural pathways from non-life to life.

Even the king of hard sciences, physics, is not without its defects.  At the limit of the very small, theoreticians are looking for ever more esoteric "particles" which they hypothesise to exist (such as WIMPs, MACHOs and axions), but for which there is little or no experimental evidence.  There are now pleas for many billions more dollars to build ever larger accelerators to look for such imagined particles.  Meanwhile, whimsical and esoteric "grand unified theories" (or theories of everything) attempt to account for all the known forces while postulating various new unknowables.  Yet our two most successful theories in physics - quantum mechanics and relativity - remain mutually incompatible.

At the other end of the scale, cosmology is tripping over various murky hypotheses about what makes up 95% of the universe, while pretentiously explaining in great detail what must have happened in the first attoseconds of the big bang, using ungainly theories of "inflation".  Meanwhile, "dark matter" and "dark energy" continue to elude capture, and even consistent definition.  And recently, to avoid the need to acknowledge that the universe had a definite beginning, and the obvious precision fine-tuning that goes along with that, some posit an increasing menagerie of unobservable and complicated multiverses, setting aside Occam's razor in favour of elegant mathematics.

Again, don't get me wrong: I love science, and especially physics, but in general, much of the scientific enterprise seems to be wandering beyond reality.  Retractions are on the rise, and more and more time is spent publishing the latest marginal findings and chasing continued funding or the all-important citations in the literature.  Reputation and the reigning paradigm seem to trump actual content, valid conclusions, and credible advances.  Overstatement, emotive appeals and even unfounded hype leak into publications and get blown out of proportion by the eager science media.  Interpretations go beyond the evidence, especially in controversial areas, while unpopular implications of research are denied, ignored or suppressed.

Meanwhile, there is now a whole breed of publicity-seeking scientists making pronouncements far outside their areas of expertise, and the public tosses about unfounded statements of what "science says" or "studies show" without even being able to cite sources.  Much of science has become politicized, with favoured viewpoints skewing, or sometimes even dictating, how experimental results are to be interpreted and reported, even when the conclusions are not supported by the data.

All of this serves science poorly.  Instead of neutrally investigating hypotheses and welcoming alternative perspectives and theories, much of science is now stuck in favoured paradigms and looking mostly to shore up reputation, funding, esteem and legacies.  Thus, science has shifted from the idealized view of Karl Popper, that only falsifiable hypotheses are science, to the more cynical views of Thomas Kuhn, where paradigms of belief are held firm until the old guard dies off.  As Max Planck famously said, "Science advances one funeral at a time".

A lengthy look at the trends in modern scientific studies led one non-scientist observer to write:
"Both natural and social science investigators have exhibited many of the pathologies of modern science, i.e. failing to report negative findings, ignoring counter evidence, relying on correlation rather than on more rigorous test for causation, failing to pursue replication studies that would confirm - or refute - earlier results, misusing data, attributing higher levels of accuracy to their data than warranted, and torturing statistics in order to produce more useful outcomes. Much of this abuse is driven by competition for funding and prestige that characterizes modern academic research, but some of it is driven by ideological preferences."
Michael Hart, Hubris: The Troubling Science, Economics and Politics of Climate Change, p. 565.

Even at Scientific American, blogger John Horgan has unkind things to say about the declining state of science today: the replication crisis, questionable motives in the healthcare industry, the overhyping of incremental research findings, untestable hypotheses touted as science, and so on. Here is another blog saying much the same thing, but in better words than mine above.

I am not a scientist myself, and do not have firm suggestions for improving how science is done.  Fortunately, however, many people are beginning to take note of the problems outlined above, and wiser heads have made suggestions on how science could be made fairer, more open and transparent, and less tied to financial and societal success.  Science is important to future human and environmental well-being, and the better scientific establishments are at doing true science, the more readily the public will accept their results and appreciate the work they do.

By all means everyone should study science, but it is best to do so with a sceptical mind, looking for weasel words, hidden biases, unstated assumptions, and unsubstantiated conclusions.  Consider alternative interpretations and look for critical analyses of published findings.  The deeper you delve into science, the more interesting it gets, even as it remains a messy, all too human pursuit.

Sunday 11 August 2019

Starship Technologies

Forgive me for dabbling in some science fiction here, but for decades, on and off, I, like many others, have speculated on what it will take to build and send out an effective Starship; that is, one that can take humans to another star system beyond our solar system in anything like a reasonable time frame.  This is, of course, a long-term conceptual dream, with negligible chance of coming to fruition in the next 100 years.  However, I also believe it is not an impossibility, and that, barring human extinction or other catastrophe, mankind will eventually cross the huge voids between stars and begin populating the galaxy.  Bear with me as I conjure up some technologies I feel will be needed for this ultimate human adventure.

First, I am assuming that any starship will take the slow-boat approach to Alpha Centauri, or wherever we decide to send it.  Between now and then, solar-system-based technology, such as precision telescopes and radio astronomy, will have helped us learn more about nearby star systems, so that we can choose the one most likely to support some sort of human colonization effort, assuming that is the ultimate goal.  It is probable too that before building a manned starship, we will have sent robotic ones to nearby stars by some method; enough to have explored them somewhat, so that we are not sending people in blind.  That would presumably take several decades, but it would also serve as preliminary work for the Starship development itself.

The slow-boat approach assumes no "warp drive", magic gravity-changing techniques, or other speculative ways to achieve reactionless thrust and thus sustained high acceleration.  The slow boat is just a continuous push at low acceleration over a long time (decades), something we can mostly understand today, with no new physics required.  See more below on Starship drive possibilities.

Biotechnology will be one key component for sending people and other life forms to the stars with minimal weight penalty, minimal life-support requirements, and no worries about boredom en route.  We are already developing the means to go from DNA to living lifeforms in the lab.  With further development, it should eventually be possible to reliably reconstitute many different animals, and even humans, from a DNA library.  Plants can be grown from seeds, and in some cases, animals can be transported as frozen embryos.  We already have that capability for simple lifeforms, and we can store human ova and sperm, or embryos, for years.  Future technology may approach artificial-womb status, at least for animals, allowing various species to be regenerated at the far end of the voyage.

The ultimate would be an automated laboratory that can generate any DNA from a data file and then use biochemical and cellular mechanics to grow each lifeform from its particular DNA strand.  Using such an approach, a Starship could carry a significant population of numerous species - even an entire biome - to another star system, with minimal mass/volume and life-support costs.  The laboratory I have in mind does not yet exist, of course; it would have to be highly automated, and reliable enough to work properly after decades in space - a major hurdle for something so complex.  Some sort of advanced AI and general-purpose robotics would be needed to do most of the work and render the lab operational after the long flight.

To maximize the chance of a successful trip, a Starship will most likely need some live, trained humans at some point during the voyage, to handle unexpected emergencies, or at least to kick start and supervise activities upon arrival.  Therefore it may be good to send a few (perhaps six?) actual human adults along in some sort of yet-to-be-developed cryosleep.  I envisage a technique for greatly slowing human metabolism, supported by blood additives and a variety of chemical drips, cooling down the human body to perhaps 10°C.  The human would then be in a coma, monitored by the robotic AI, and could remain in that state for years with perhaps little aging.

There has been some work in this area, but it obviously needs a lot more research to become practical, especially for reawakening the person at the end of the trip, or earlier if needed.  I assume the robotic AI would be in charge of the Starship through most of its voyage, awakening one or more humans only when needed at the end, or perhaps in an emergency.  The human crew would likely be all female so that they could have children (via in-vitro fertilization) in the new world.  Biology is not destiny, but it helps in this situation.

To minimize travel time, the starship would likely have to get up to somewhat relativistic speeds midway through the voyage. Thus, the people and stored DNA inside would be subjected to increasing, and increasingly directional, ionizing radiation in interstellar space: high energy photons, electrons, neutrons, protons and an occasional atomic nucleus.  To shield the biological "cargo" against radiation damage over decades, heavy water could be used to surround the sensitive humans and other biological specimens.  Of course, the same heavy water could provide the fuel for a fusion drive for the Starship.

Ultimately, of course, the primary limitation on any Starship design is the drive: how to get decades of significant acceleration out of an engine, using less than 100% of the starship mass for fuel.  At present, if I may be so bold, the only significant hope here is fusion power.  Granting that any practical fusion generating station is many decades off (see this for an explanation), the concept of directed fusion exhaust pushing a starship with some reasonable efficiency is not impossible, at least in principle.  No, we cannot do it today, but maybe in a hundred years we will be able to harness the materials, physics and other technologies needed to do so reliably and sustainably.

The idea is to take tons of heavy water (D2O), break it down, and fuse the deuterium nuclei into helium, yielding enough energy to blast the helium, loose electrons, and perhaps the leftover oxygen out the back of the ship at high speed, more or less in one direction.  Fusion energy gives the biggest push per unit mass of fuel that we are likely to see for a very long time.  The starship would be built in orbit and then accelerated at some low but steady rate until halfway through the voyage. It would then turn around and decelerate the rest of the way, so that it could arrive at a low enough velocity to successfully enter the target system and orbit a planet there.

What realistic acceleration a fusion drive would yield, and what ratio of fuel mass to ship and cargo mass it would need to travel a few light-years, is for others to calculate.  They have done so under varying assumptions, and although success is not a slam dunk, there is some hope of eventual good results.  Aside from the technical hurdles of achieving continuous fusion, keeping the torch burning successfully for decades would also require major engineering design work. Nothing in Starship design is easy!
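
For the curious, here is one hedged, back-of-envelope version of that calculation (my own illustrative numbers, not a design).  Complete burning of deuterium releases roughly 0.4% of the fuel's rest-mass energy; if all of that went into a perfectly collimated exhaust, the exhaust velocity would be about

\[ v_e \approx \sqrt{2 \times 0.004}\,c \approx 0.09\,c \]

and the classical rocket equation then gives the total velocity change for a ship that is, say, 75% fuel:

\[ \Delta v = v_e \ln\frac{m_0}{m_1} \approx 0.09\,c \times \ln 4 \approx 0.12\,c \]

Since that budget must cover both acceleration and deceleration, and since a real drive would be far less efficient (dragging the oxygen along only dilutes the exhaust), cruise speeds of a few percent of light speed, and trip times of a century or more to Alpha Centauri, seem the more plausible outcome.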

I have mentioned robotic AI above, and that is a more promising area of development.  There have already been major advances in robotics and artificial intelligence, and these will doubtless continue and become more advanced over the next century.  While I doubt we will ever make a human-like AI system, the software will surely advance to the point that most controls and processes can be fully automated.  There would have to be numerous repair mechanisms, contingencies and reset protocols built in, along with considerable redundancy, but we humans already do a lot of that for other purposes.  Fortunately, AI software does not take up much space or power, and robots need no life-support systems.

There would also be major technology requirements at the end of the trip. What needs to be done when the Starship has arrived at its destination and entered orbit around a selected planet?  The human crew (or cargo) would have to be awakened from their deep sleep and allowed to recover.  The AI could continue operating the Starship, and could perform much of the lab work and orbital tasks before anyone or anything actually goes down to the planet.  The fusion drive could be shut down, or turned way down to provide ongoing electrical power.

Previous unmanned missions would most likely have occurred, to test the Starship technology and to deliver most of the hardware needed by the crew at their destination.  Landing shuttles and their fuel, supplies for the crew, materials for on-orbit construction, research and communications equipment, extra habitat modules, and so on could be delivered in advance of the manned Starship. These earlier missions would also scout out the system for potential habitations such as asteroids, moons, or "goldilocks" planets, and collect long-term data to minimize surprises after the ship's arrival.  Thus the crew would have a head start via robotics at the far end, and their ship would only finalize its approach once everything was reported to be in place and ready.  Moreover, they would likely stay on orbit for months in preparation for descent or major construction projects.

After arrival, detailed plans would be made to begin colonization.  The crew, aided by the robots, would prepare the needed equipment, likely by cannibalizing the Starship materials, which would have been designed partly for that purpose. The (female) crew would become pregnant via IVF and occasionally deliver babies, some of them eventually male.  If artificial wombs are possible by then, they could be put into service instead.  The crew and support robots would grow or synthesize food, perhaps initially using algae and yeast cultures on orbit.  Most supplies could be recycled, and materials could be reused or remodelled into needed equipment or tools, probably using some form of 3D printing.  Much as in proposed plans for a Mars colony, ship materials could be used for a larger habitat.  With a suitable habitat, they could grow plants and start to generate live animals.

Eventually, the colony would want to descend to the surface.  For that, they would need to have studied the planet in detail from orbit, and made plans and contingencies for every scenario.  Probably only a small team, complete with robotics, would be sent down initially to get things going, build the infrastructure and test out the processes for living there.  It is unlikely the planet would have abundant life and an oxygen atmosphere, so processes and equipment for life support would have to be made there or taken down from orbit.  The planet-bound people would work on developing soil, water supply, power, waste management, and all the myriad other things needed to support a small colony, using mostly materials extracted from the planet.  None of this will be easy, and people will probably die along the way, but none of it is impossible in principle.

Once the initial colonists have shown they can live on the surface, grow food, recycle effectively, and generally make a go of it - possibly over a year or more - the remaining people on orbit, along with the DNA database and labs, could be brought down as needed, and the colony would then grow slowly and carefully.  Whether some crew remain on their now-demoted Starship in orbit, or leave that job to the robotic AI, can be decided at the time. However, the idea is that there should be no need for people to return from the surface to the Starship, thus simplifying the colony's transportation requirements (no huge rockets needed).

Aside from the fusion drive and the futuristic human biotech, the other major hurdle will probably be reliability.  How can we ensure the AI, robotics, and bio-lab will continue to function, or can be self-maintaining, over decades in the unforgiving environment of interstellar space?  How could we maintain complex systems without human attention over the same period?  Given the track record of computers, complex factories, and mechanical hardware, even here on Earth, the prospects are daunting.  This calls for careful engineering and probably redundant design at every level, up to and including multiple Starships to the same destination.  Having a human crew available during the trip might improve the chances of success, but would entail other requirements and raise other problems, such as life support and human sanity over decades in space.

Serious science fiction writers have grappled with these issues and suggested future technologies along these lines for a long time.  As technology advances and our understanding of physical processes and the requirements for space travel improve, the fiction becomes more detailed, and in some cases, more believable.  At some point in the distant future, perhaps a century from now, the science fiction will turn into science reality, and serious engineering can begin to design a real Starship, perhaps using some of the above concepts.  By then, mankind will have learned to live on Mars, and possibly other planets or asteroids, and nearby star systems will have been studied in great detail.  By that point, sending off a Starship to seed the galaxy will not seem so far fetched, but will simply be the next giant leap for mankind.

Thursday 18 July 2019

Philosophy 102 - The Nature of Reality

To summarize my previous Philosophy-101 post, I exist, there is most likely some reality apart from me, and you, dear reader, probably exist as a mind separate from mine.  How's that for a recap?  Now let's explore a bit further into this supposed reality.

Maybe the "reality" as I perceive or experience the sensations coming into my mind bears no resemblance to what is actually outside my mind, in the "real world".  How can I know that I actually have hands and eyes, that what I "see" as I am apparently typing out my thoughts is indeed a computer screen with words on it and a black keyboard, along with opaque walls, transparent windows, etc.  Can I be certain that the noises I hear reflect traffic on the street or music in the next room?  Here too, all I have for certain are the sensations (or perceptions as John Locke calls them) coming into my mind.

As mentioned last time, I could be trapped in The Matrix, with a data stream generated by a super-AI feeding my brain stem.  At a different level, even assuming that these sensations bear some semblance to the "reality" around me, how can I be sure that the "real objects" they seem to represent are actually as they appear to my mind, are stable in time and space, and continue to exist as I see them now, when I am asleep or just away from this room?  Is there really anything behind the wall in front of me?  Is my wife really in the other room watching TV, or is this all some sort of Holodeck program for my confusion and deception?

In this regard, Bertrand Russell made a key observation in 1927: “We do not know enough of the intrinsic character of events outside us to say whether it does or does not differ from that of ‘mental’ events”, whose nature we do know.  He never wavered from this point.  In 1948, he noted that physics simply can’t tell us “whether the physical world is, or is not, different in intrinsic character from the world of mind.”  In 1956, he further remarked that, “we know nothing about the intrinsic quality of physical events except when these are mental events that we directly experience.”  In other words, our mental reality is the only reality we know for sure exists!

However, there is a high degree of coherence and consistency among the various aspects of the stream of sensations entering my mind.  When I reach out my hand and touch my desk, the position of my hand correlates well with the intention I had in moving it; the touch sensation of the hard surface is consistent with what I see and what I remember from the past.  If I tap on the desk, the sounds I experience are consistent, and if I press too hard, the pain I feel fits into the same overall mental picture or model.  When I remove my hand, the appropriate sensations occur on cue and are all consistent.  This applies regardless of what I do.  Even when I make a mistake or stumble, or something unexpected happens quickly, it all fits together into a coherent whole - a perfect virtual reality, if that is all it is!

What's more, in my interactions with the "other people" around me, they seem to be in the same reality as I am.  If I ask someone to pick up that book over there, they seem to hear my words, see the same book as I do and understand what I want, and if they are agreeable, they can pick it up.  We both then see the same action occurring, albeit from different perspectives, and if we continued talking about our own sensations of the event, they would agree reasonably well.  Thus, other people seem to have a very similar view of the reality around them.  This suggests that there is indeed something real about that reality.  The two of us, and others as well, can navigate and operate in complex ways around the same objects and spaces, and interact with each other physically as well as verbally in very consistent ways.

The exceptions sometimes offered by philosophers just prove this point.  The stick in the water that appears to be bent due to refraction effects is seen the same way by anyone looking from the same position, and is well understood to be an exception to seeing a straight stick.  The same holds for optical illusions.  The very name says that we understand they are not representative of the true reality but are just artifacts of how our visual apparatus works.  We can usually explain the apparent abnormality in a cogent way, acceptable to most people.  Thus, these "exceptions" do not seriously undermine our models of the reality around us, nor our belief in its true existence and nature.  Indeed, one could say that magicians present stronger evidence against our models of reality when they fool us with their tricks.  But there too, we know they are not messing with reality, even if we don't know how they fool us.

This gets to my primary reason for accepting that our perceptions of the reality around us are fairly accurate, and that reality is, by and large, as we experience it.  The reason is that, if there is any reality beyond our minds, the simplest perceptual apparatus is one that reports it fairly accurately.  As an engineer, I know that simple, semi-linear sensors, transducers and detectors are much easier to design, make and use than ones which send out signals (to our minds) that bear little resemblance to the reality they transduce or sense.  If reality existed, but was significantly different from the way we perceive it in our minds, then the intervening transducers and sensors would have to be extremely sophisticated.  Worse, all these errant data streams would have to be continually correlated in complex, high-speed, non-linear ways in order to fool us into interpreting them as simple yet consistent inputs from a coherent reality.

Evolution (if you believe in it) would surely adopt the approach that an accurate representation of the "real world" is better for survival than an inaccurate one.  If various people can get together and agree on the attributes of, say, a table in front of them, and that, say, a photo of the table seems to represent the same object, then the simplest hypothesis is surely that reality is to a large extent as we perceive it to be.  How would evolution proceed to give us such a mental model of reality if that reality were totally different from the model generated in our minds?  Any biological equipment to do the latter would have to be very complicated, and could not arise by a series of simple evolutionary steps.  Of course, this particular argument assumes that we are indeed biological creatures, so perhaps it is merely begging the question.

Obviously we do not perceive reality perfectly or completely.  We cannot see in the dark or detect radio waves or X-rays, and we cannot accurately quantify the sensations we do detect in terms of decibels, brightness, intensity, force, taste, etc.  But most of what we see, touch and hear is most likely a reasonably accurate representation of what is around us; at least of the parts of reality important to our continuing existence in the environment in which we find ourselves.  All people with normal colour vision will agree that this desk is brown, and that if we have only a red light in the room it will, of course, look red.  Imperfect perception does not imply perception that is totally wrong.

Another approach to this question might be to consider a new-born child.  Born into the world with no apparent mental model of reality, each baby must create and assemble her own such model, based on the sensations perceived by her developing mind.  This is a major undertaking, making babies the world's busiest scientists and philosophers!  What are these random blobs of shade and colour that I somehow perceive?  As I grow, I find that they go dark when I close my "eyes" (something that initially happens without my control).  Then I notice that when I move my "head" (also uncontrolled at first), everything shifts, but in a consistent way.  Later, I discover that these things waving in my visual field can be controlled, and eventually learn they are my "hands", which prove to be quite useful as I explore their movement and purpose.  In such a way, a child discovers her world and builds up her mental model of reality.  Of course, none of that is done consciously at first, making it even more likely that it is real.  But here too, perhaps this is just question begging, since I am assuming that eyes, head and hands are actually real.

All of this still leaves open the possibility of a very clever virtual reality; that there is nothing apart from our minds and we are operating in a totally deluded state, either self-delusion, or generated by some other entity to deceive us.  For my best shot at getting past that hurdle, see Philosophy-101 again.  If other people truly exist, then we all occupy the same reality, or at least think we do.  A virtual reality capable of fooling billions of people, continually over decades of time, would be beyond anything we could do, and would raise the "why?" question.  What possible purpose would be served?  Yet, we cannot completely discount this possibility.  Nothing in reality can be proved or disproved absolutely!

The final alternative to reality being real is that we are all part of some grand simulation.  Like a vast video game, we are software constructs in some immense computer.  This staple of science fiction is taken seriously by many people, and cannot be totally disproved.  However, if this option is true, then the simulation would simply be our true reality, so that, once again, "reality" would be as we perceive it.  After all, even in our taken-for-granted reality, everything we see and do is just atoms and electromagnetic fields interacting, and we don't perceive things at that level.  So whether atoms or bits in a computer, our reality is as we perceive it.

Having attempted to solidify our belief in the reality around us, I should say that there is considerable and growing evidence that, at least at microscopic scales, reality only exists as such when we observe it!  Quantum physics experiments bear this out for photons, electrons, atoms, and even fairly large molecules.  Until we (or some mind) observe the experiment, or the interaction of these particles, the outcome remains a set of probabilities in superposition - unexperienced, and so undefined or even, in some sense, unreal.

This finding has serious and troubling implications for philosophy, science and consciousness.  When I close my eyes, does the computer in front of me continue to exist?  Is there really anything behind the wall in front of me if I am not actively thinking about it?  Such questions seem silly to most of us, but they are the sort of thing philosophers delve into.  And more and more, cosmologists and physicists are also considering questions like that.  That mind may be essential to the existence of reality is a serious topic of ongoing study and debate.

Nevertheless, like 99.999% of humanity (assuming such exists), I will go on believing that my mental model of the world around me fairly represents reality, and that the sensations I perceive are good representations of what is going on outside my mind.  I cannot prove that for certain, but little in life is certain, and the assumption surely makes life easier and more interesting.  In any case, these questions are fun to explore and discuss.

Wednesday 26 June 2019

Thoughts on Genesis 1-4

The early chapters of the Biblical book of Genesis are foundational to the Judeo-Christian scriptures, doctrines, and worldview. The first four chapters cover the seven days of creation, Adam and Eve, the Garden of Eden, the Fall from grace, and mankind's first generations in the world. These chapters have been the focus of much discussion over the centuries and I would like to add a few observations and ideas of my own to the mix.

I believe there are two opposite errors in interpreting these chapters.  One is an overly literal reading as a detailed historic account of actual events.  That is a fairly recent fundamentalist way of understanding scripture.  The other is to dismiss it all as a myth that science has "disproved"; the modern or "progressive" way of reading and then ignoring scripture.  Between these two extremes, much has been written about these short chapters of the Bible.  They are central to the creation-evolution debates, and of course, they set the starting point for all of theology and Christian doctrine.  I will not repeat what is widely available elsewhere.  In particular, I will not comment on most of Chapter 1, and especially the "seven days" of creation, which have been the subject of much argument and speculation.  Rather, I have some insights (I hope) and observations that may help shed further light on parts of the text, and possibly defuse some of the polarized arguments, at least for people looking for serious study of these first parts of the Bible.

Adam & Eve and the Image of God:
Introduced in Gen. 1:26-27, the image of God concept is, I believe, one key to understanding some of the seeming difficulties in Genesis.  Humans (species Homo sapiens) have physical bodies and are alive like any other mammal, but we are more than that.  The Bible teaches that we each have a spirit, a spark of the divine, setting us apart from other animals.  God himself is entirely spirit; apart from Jesus' incarnation, he has no body, so his "image" is surely also spiritual rather than physical.  Our human spirits from God give us self-awareness, introspection, and other higher-consciousness attributes unique to humans; for example, the ability to wonder about morality, mortality, God, the future, the universe, etc.  Our spirits also make us capable of creativity, complex language and social skills, and higher intelligence, among other special attributes setting us apart from other animals.

Thus, one way to read Genesis is that Adam, the first man, was a spirit-infused hominin, a clever ape transformed into a human being with a spirit.  Gen. 2:7 can be read to support this idea.  Hominins were formed from the "dust of the Earth" as God's handiwork in creation, over eons of time.  Once God deemed the species capable of sustaining and enabling a spirit, perhaps around 50,000 years ago, it was time to create man (Adam).  Thus, there could have been a hominin population in the Pleistocene, from which two healthy, young, representative adults were selected by God, who then breathed spirits into them.  Voila, Adam and Eve, who were then separated out and placed in the garden of Eden.  As newly born spirits, they would be naïve about morality (i.e. have no knowledge of good and evil), yet able to learn and commune with God their creator, for his good purposes.

Versions of this approach have been suggested by others.  It does justice to the Biblical text, while resolving various issues many have with it.  One intriguing thought is about language.  Adam and Eve may have had rudimentary vocal communication from their hominin past, which God augmented, perhaps aided by their naming all the animals (Gen. 2:20).  The perennial question of whom Cain married (Gen. 4:17) can easily be answered: Cain selected another hominin female, and God then ensured that their children also received spirits.  Similarly, a population of non-spiritual hominins could explain who Cain was afraid of in Gen. 4:14, and how Cain could build a city by himself in Gen. 4:17.  Using his superior intelligence, Cain could train and lead other hominins.  The odd verse in Gen. 6:2 about the sons of God choosing spouses from the daughters of men may suggest that others of Adam's descendants also married un-spirited hominins they were attracted to.  One effect of the Fall may have been not recognizing whether other hominins carried spirits.  God in his mercy ensured that all offspring of spirited hominins (i.e. humans) would have human spirits.  Thus, over time, as the superior humans took over, all Homo sapiens would have spirits.

Original Sin:
This concept of humans as spirit-infused Homo sapiens may also help some better understand the doctrine of original sin.  The Pleistocene hominins did not have spirits.  Thus, they were like other animals, without any moral compass.  No one calls a lion evil when it catches, kills and eats an antelope; that is simply its nature.  Similarly, ethicists and scientists alike balk at the idea that a male chimpanzee killing an unrelated baby chimp is somehow evil.  Humans do not, properly speaking, attribute morality to animals.  Thus, before receiving spirits from God, the hominin group was incapable of sin, even if they beat, cheated, or even killed each other.  That would simply have been part of their "nature" as a species.

Once the original couple was chosen and infused with spirits, however, God gave them some simple moral duties and rules (Gen. 2:15-16).  This may, in a way, be considered the granting of "free will", or a moral conscience, as a gift from God.  Even though they did not yet have the "knowledge of good and evil", they were now free to either obey or disobey God.  Tempted by the serpent, they chose (however naïvely) to disobey, and so became the first human sinners, resulting in the Fall as recorded in Chapter 3.  The Bible says that all subsequent humans have inherited this propensity to disobey God and to knowingly sin from Adam and Eve, as our first parents.  Indeed, original sin is perhaps the one Christian doctrine that is on clear display every day in every culture!

Good, not Perfect:
Some Christians claim that God's initial creation, prior to the Fall, was "perfect".  But the text does not actually say that.  Only God is perfect, and he himself deemed creation "good", and at the end, "very good".  As creation's engineer (designer) and maker, God knows that no complex system can be deemed "perfect".  Any design can, at best, only be "optimal" according to some criteria selected by the designer.  In any design there are tradeoffs among varying and sometimes conflicting design goals and purposes.  Thus no reasonable engineer will claim his design is "perfect", although he may say it is very good; i.e. good enough under the specified constraints and specifications.  The upshot is that Eden was not "perfect", and realistically, seeming imperfections or conflicting purposes could occur among different intents and perspectives within the garden, even before the Fall.

Death:
An important consequence of the above is the introduction of death into the world.  Many Bible believers assume there was no death in Eden, and that mortality came into existence after the fall.  I think this is an overly simplistic reading.  First, when God told Adam not to eat the fruit of knowledge of good and evil (Gen. 2:17) and that if he did he would die, Adam must have known what that meant, or else it would have been a meaningless warning.  Adam must have seen plants and/or animals die (perhaps in his pre-spiritual life?), so he had a fair idea what death was.  Moreover, all living involves death on a small scale: the shedding of hair and skin cells, and the daily turnover of blood and intestinal cells.  Adam would not have thought of these as "death", of course, but cell death is death nonetheless.  At a larger scale, Adam would surely have seen insects die, some with short lifespans, and possibly by stepping on or swatting some himself.  Plants too go through cycles of life and death.  Certainly plants would die when eaten as specified in Gen. 1:29-30.  Thus, there must have been some forms of death inside Eden.

More importantly, I believe the death God warned against was spiritual rather than physical.  Adam and Eve did not physically die immediately upon eating the forbidden fruit.  Indeed, Adam lived rather a long time afterward (Gen. 5:5).  However, after the Fall, Adam and Eve lived apart from God.  They avoided him (Gen. 3:8), and except for his initiatives, they lived away from his presence.  As God's image bearers, Adam and Eve had spirits from God.  Spiritual death is usually understood to mean separation from God, which describes what happened to this couple when they were evicted from the garden (Gen. 3:23).  Thus, the role of death in the Fall may be seen as spiritual rather than physical.  Indeed, God banished them partly so that they would not physically live forever by eating the fruit of the tree of life (Gen. 3:22).  It is clear, therefore, that Adam at least had the potential of dying physically, even before the Fall.

By the way, the description of the fruit trees and the rules for what they might and must not eat (Gen. 2:9 & 16) suggest that Adam and Eve had not been in Eden very long before the Fall.  They were told not to eat of the one tree, but could eat of any others, including the tree of life.  The fact that they had not yet eaten of that tree (Gen. 3:22 again) suggests that they were still in the early stages of living in Eden.  They must have been there a few days at least (Gen. 2:15-23), since Adam had time to be lonely, name the animals, sleep, tell Eve about the rules, and have some fellowship with both Eve and God.  But if they had been in Eden for many months or years, they would surely have tried every tree's fruit to see what it tasted like, including the tree of life.  One could argue that the tree of life had no fruit for a long time, I suppose, but that would surely not hold for more than a year or two at most.

The serpent's half-truths:
In tempting and tricking Eve (and hence, Adam) in Gen. 3:1-5, the serpent (identified as Satan) was not lying.  Rather, he may have used the ambiguity of "death" noted above to fool her.  He meant (verse 4) that Eve would not die (physically) upon eating the forbidden fruit, while God likely meant spiritual death in his warning.  Eve, being naïve (without the knowledge of good and evil), took Satan at his word and overlooked God's rule, which she must have learned from Adam, since she was not yet in the picture when God gave Adam the rule.  Being tempted by the serpent's promise of becoming like God, she ate the forbidden fruit and gave some to Adam, who also knowingly ate it.  Moreover, Satan did not lie about the other effects of eating the fruit: "you will be like God, knowing good and evil" (verse 5).  After all, God admits the same in verse 22.  This does not mean, however, that Satan is to be trusted!  The father of lies is most successful when he tells half-truths and twists meanings.

The Garden of Eden:
Genesis 2:8 clearly says that God planted the garden of Eden.  This suggests that the rest of the world was not a garden, and therefore was natural wilderness, complete with the thorns and thistles encountered by Adam after his banishment (Gen. 3:18).  This wilderness might, perhaps, have included the wild and more dangerous animals as well.  Once Adam was evicted and there was no one left to look after the garden (his job in Gen. 2:15), it would have reverted to wilderness and disappeared.  Thus, there is no Eden today, and the angel put there to guard it (Gen. 3:24) could have left after a few decades.

There are doubtless other aspects to discern in these first few chapters of Genesis.  Many books have been written, underscoring the importance of the early parts of Genesis for our understanding of God, people, and the world.  While many dismiss Genesis chapters 1-10 as mere myths for primitive man, the stories and ideas included there form the basis for a significant portion of western civilization and human history.  Even non-believers can gain insights from reading these chapters with an open mind.  The above suggestions and interpretations are not intended to be gospel truth by any means - even I do not necessarily believe them all.  Rather, they are offered here to promote thinking about the Bible and how we might understand Genesis.

Wednesday 15 May 2019

Media Mind Control

"We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of."      E. Bernays, 1928.

Most of us would doubtless be aghast to think that someone else was controlling what and how we think about certain topics.  Consider the following examples, however. When you think of "anti-abortion action", what comes to mind?  If you watched TV shows (dramas and news) through the 90s, or read mainstream media reports during the 80s, you would immediately think of abortion clinic bombings and pro-life violence.  Yet there has been very little of that in the past 30 years, and some of the "violence" blamed on pro-life people is actually just peaceful protesting, or, when you look into the details, not caused by them at all.  Indeed, there is more violence perpetrated by pro-choice advocates, but you probably have not heard of such cases because they only rarely get reported in the mainstream media.  You see, the media are, by and large, pro-choice, and that skews their selection of stories and how they report them, or choose not to.  It even skews fictional TV dramas which deal with the subject of abortion.  Those biases have been uncritically absorbed by viewers and readers as a true picture of reality on that subject, and then reflected in their own views.

Next, think of "Roman Catholic priests"; does an image of pedophiles pop into your mind?  Yet the number of priests actually found guilty of pedophilia is minuscule, and other authority figures such as school teachers, scout leaders, sports coaches, etc. have higher rates of child abuse.  But you would never know that from the mainstream media.  Now think of any issue you do know a lot about and have studied from various perspectives, and ask how well the media presents that issue.  Chances are good that you can see how the media have misrepresented or oversimplified it in some way.  And if it is a controversial subject, you may discern a definite and ongoing bias in their reporting and discussion of it.

Take another subject: "churches and gay rights".  Chances are your mind dredges up some image of the Westboro Baptist Church's "God hates fags" group, or some assumed "homophobic hatred" by the Roman Catholic church.  Meanwhile, aside from a few tiny groups, most Christians do not hate LGBT people at all, and most befriend and engage them.  Indeed, collectively, churches do more to help AIDS victims around the world than any other organization, but again, that does not receive widespread media coverage.  Meanwhile, Islamic groups that actually preach (and sometimes practise) violence against gays are largely given a pass in the media.  For some reason Muslims are often also presented as "victims" in the mainstream media.  You might think about why that is so.

These are but a few, some perhaps extreme, examples of where the media-reinforced image is highly skewed by selective reporting and unacknowledged - or worse, conscious - bias.  Most of us just adopt these opinion patterns and story lines, without much thought, assuming that we are getting a reasonable story and, by and large, a fair picture of the truth of the matter.  Unfortunately, in many subject areas, that is simply not correct, or at least it is far too simplistic.

If you are a typical North American, then your views, however vague and inchoate, on most issues of our day, are absorbed from the mainstream media; whether TV news, talk-shows, newspapers, movies and TV dramas, or Internet social media.  This is hardly surprising; since none of us can study or be well informed on more than a few issues, we need to take whatever views are on offer for all the rest unless we want to remain uninformed and un-opinionated.  We may think that we are getting balanced reporting and a fair assessment of the subject, but more and more, that is simply not true.

Most of us choose web sites, TV channels and newspapers with editorial views that reflect our own outlook.  This is natural: who wants to read opinions and news slants that make us uncomfortable, even with our loosely held viewpoints?  However, unless we go out of our way to check alternative views and read articles from opposite perspectives, we will tend to absorb one-sided material that just reinforces our own perspectives and biases.  This sort of "echo chamber" or "silo" approach to building and affirming opinions is actually getting worse in the Internet age.  And with the slow demise of newspapers, journalism is becoming less balanced and investigative.  Opinion pieces become predictable, and "news" becomes mere echoing of others' stories - those the editors or owners find agreeable.

There is a Biblical reference to this human tendency in Proverbs 18:17, "The first to present his case seems right, until another comes forward and questions him."  If the other is not allowed to question, or if his speech is muffled or cut short by the judge, then it is not surprising that biased judgements occur and skewed viewpoints propagate.  When the media do the same en masse, the propagation is thorough and widespread.

Even if your favourite news outlet were truly "neutral" about all issues (unlikely), you would still get a skewed version of reality because of what they choose to report on and what they leave out.  No news outlet can possibly cover everything, so they choose subjects that will attract attention, viewers or readers.  After all, they are in business to make money, which mostly comes from advertising, so the more people who watch or click on them, the better - the "click-bait" phenomenon.  So "man bites dog" takes precedence over "dog bites man" stories, even though the latter is 1000 times more frequent and important.  Watching the news may make you think violent crime or fatal car accidents are on the rise, even if the opposite is true, albeit boring and not newsworthy.  How many people fear to go out at night because they've heard of a mugging across town?  And wouldn't you feel safer if they did not report the event?

But, you may say, the major news media are surely not biased, are they?  Alas, it is a rarely acknowledged fact that the vast majority of journalists are liberal minded, or (in the USA) Democrat leaning in their political views.  Hence the huge hate-on for Donald Trump, for instance, leading up to and continuing after the 2016 presidential election.  And before you think that their approach was reasonable, given Trump's obvious flaws, recall where your "knowledge" about the man comes from!  Can you see that if the media you pay attention to hates Trump and presents only negative reports about him, that might rub off on you?  Once it does, you will tend to read and remember mostly anti-Trump reports, which will only solidify your opinion.  That tendency is just part of human nature.

This human proclivity to consider only data that supports one's personal views is a danger for people on the other side of each issue too.  But given the left-liberal preponderance of media sources, the impact on public discussion and policy-setting is largely one-sided.

Moreover, when it comes to social issues, journalists are predominantly liberal or even "progressive".  This is not some grand conspiracy; it is simply one result of the nature of the journalism job and the news itself.  "Things are going fine" is not a headline you will often see!  Rather, it is the new, different, novel, and avant-garde that dominate the headlines, the progressive or even transgressive that captures our attention, or our interest in titillation or weird exposé.  This too comes from human nature - boring is just not fun.

Therefore, most journalists take more interest in things that change rather than stay the same; hence liberal rather than conservative.  Throw in the echo-chamber effect of journalism schools, press corps, news bureaus, etc. and it takes a strong character and dedication to seek answers against the flow.  Once oriented to the liberal perspective and viewpoints of those around you, anything different starts to look odd, suspicious or just "wrong" somehow, especially if you don't take the time to examine why you feel that way.  After all, there are deadlines and thousands more words to type in order to earn your keep.  Better just to expound on some text from a news service and put your own gloss on it, based on the perspective you share with your friends and colleagues - a natural, albeit lazy, approach to the job.

A stark example is public broadcasting.  In this country, for example, we have the Canadian Broadcasting Corporation, which is largely publicly funded, and provides news, commentary, opinions, interviews, and entertainment on radio, TV and the Internet.  It is supposed to provide balanced programming and fair coverage of events and views, to fairly represent varied Canadian perspectives and opinions.  However, its coverage is only rarely balanced when it comes to conservative versus progressive values and ideas.  Conservatives and their leaders have complained of heavy, negative CBC bias for decades.  Meanwhile, liberals view the CBC as fairly balanced.  Clearly then, the truth is somewhere in between - a distinct level of imbalanced coverage and hence, a tilted playing field in the public domain.  Add to that other major networks and social media, and contrary viewpoints become rare on certain subjects, such as transgenderism, abortion, school choice, sex education, or other socially controversial subjects.  As a result, it is not unfair to partially credit the CBC's left-liberal slant for the progressive direction of Canadian society and politics over the past 30 years or so.

If the mainstream media were just treated as unimportant fluff, all this would not matter much, but almost everyone gets their opinions and views on most subjects from either like-minded people around them or from the media.  Authorities in high places absorb their perspectives in the same way: teachers, corporate executives, judges, politicians, administrators and bureaucrats are all busy people and cannot delve into most topics, so they absorb their views by osmosis from those around them, and ultimately from the media sources they go to every day for news and commentary.  These views are then consciously or unconsciously fed into their worldview, their work, and its results.  When those results are deemed newsworthy, they are picked up by the same media and reflected back to us, filtered through the same liberal biases.  Hence a positive feedback cycle that builds up currently "politically correct" viewpoints and ignores, dismisses, derides, or even demonizes opposing perspectives.  In the field of engineering, positive feedback usually leads to instability or an extreme, stuck output: not a good way to adjust and control the system or the reality around us!
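
To put the engineering analogy in concrete terms, here is a minimal sketch (illustrative only, not a model of any real media system) of why a loop whose gain exceeds one drives its output to an extreme rather than a balance:

```python
# Toy positive-feedback loop: each cycle the prevailing "signal" (opinion)
# is fed back and amplified; with gain > 1 it pins itself at the limit.
def feedback(signal, gain, cycles, limit=1.0):
    for _ in range(cycles):
        signal = min(limit, signal * gain)  # amplify, capped at saturation
    return signal

print(feedback(0.1, 1.5, 10))  # gain > 1: quickly stuck at the limit (1.0)
print(feedback(0.1, 0.8, 10))  # gain < 1: the signal dies away instead
```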

Whether you accept it or not, the mainstream media determines to a large extent, what and how you think about most news items and issues of the day!  The next time you watch a news item, note the issue presented and your own views on that subject, especially if it is a controversial topic that you are not directly involved with.  Where did your views on that topic come from?  How did you formulate them?  Why do you hold them, however loosely that may be?  Did you actually research all "sides" of the subject from multiple sources, or have you just absorbed your views from the same media channels over the past number of days, weeks, or even years?  If your views are based on what you have seen on TV, or what your like-thinking friends and family have watched or read, then clearly, the media is generating much of the content of your present perspective, whether you are aware or not.  So in some sense, the media is indeed controlling what you think about and how you think about it - in short, mind control!

This is not as obvious or intentional as, say, Big Brother in the novel 1984, or similar dystopian fictions, but it is no less real.  If you still doubt this, look at how the general public view of certain subjects has changed over the years.  There are several subjects on which public opinion has flipped, sometimes in a short period.  Think of views about the Vietnam war, marijuana use, gay rights, euthanasia, the crusades, climate change, transgenderism, etc.  How has the public view been skewed by selective media reporting and opinions, as opposed to fair and careful analysis, and balanced weighing of all sides from a truly neutral perspective?  Once a particular view becomes "politically correct", few media outlets will speak or write against it.  And how it became PC in the first place can usually be traced to media-enabled progressivism - an avant-garde view picked up as exciting news stories, complete with presentable "victims".

Ask yourself, how did these views make a 180 degree change?  Did each citizen sit down with a wide range of evidence from every aspect, read through it and come to a well-thought-out position on each issue, thereby changing their attitude and view?  Of course not.  Rather, we each absorbed what we heard or saw in the media, as presented by journalists, based on interviews and selected polling data, or historical narratives and special interest group inputs.  Perhaps there was some scientific and factual evidence, and not all of these changes were unfounded, or for the worse, but many societal shifts occur for largely ideological reasons when special interest groups are smart enough to attract the media to their point of view.  In that way, the media, without a clear plan or stated direction, have changed our thinking on these and many other subjects.  They will doubtless continue to do so, and we can expect to absorb some future flip-flops on issues we currently hold views about, again based on how the media present whatever new perspectives they pick up and promote.

But, you may counter, the media is just reflecting public opinion shifts on these subjects.  That is easy to say, but where do those shifts come from, if not through the media and their selective promotion or demotion of one viewpoint or another?  We did not get gay marriage, doctor-assisted suicide, legal marijuana, or abortion on demand by public study of all the relevant evidence followed by fair plebiscites for approval, and wisely crafted legislation.  Rather, these perspectives were pushed by selective media interviews, biased stories as "evidence", and one-sided opinion pieces over the years, topped off by liberal legislation or, worse, Supreme Court fiats based on personal views absorbed from the same media.  Indeed, actual majority public views on controversial subjects are often negated by progressive edicts from the courts or "forward thinking" governments, claiming to be on the "right side of history".

So what can one do?  Of course, you still cannot thoroughly research every issue and news item to get a balanced view of the truth or a fair representation of reality.  However, just being aware of the role the media plays in setting your opinions and views can forearm you against obvious bias, your own tendency to silo with like-minded people, and the high probability that there are reasonable views on these subjects counter to whatever you are reading or hearing at that moment.  Watch for biased interviews, loaded questions, pejorative or supportive adjectives, selective or one-sided presentations, missing or simplistic "evidence", and other ways of skewing a story line.  Then you can be less dogmatically opinionated, and perhaps more open to looking at and listening to alternative perspectives.

Indeed, if you want to avoid others doing your thinking for you, then check out alternative news and views on the Internet or different TV channels.  They may seem weird and doubtful at first, but just think: the people who silo with those sources probably think the same about the sources you prefer to frequent!  This goes for people on both sides of the political spectrum or any particular issue!  There is nothing like an opposite perspective to get you to dig into and really think about your own views on any subject.  In the process you may become more tolerant of others' views on certain subjects you initially thought were slam-dunk obvious.  You might also become better informed and perhaps wiser for the effort!  At the very least, try to hold most of your opinions less firmly, and remain open to alternative views, until you have thoroughly looked into all perspectives.

"The structures of a [managerial-therapeutic] regime are usually able to exercise power in a “soft” fashion. These consist mainly of: the mass media (in their main aspects of promotion of consumerism and the pop-culture, not to mention the shaping of social and political reality through the purveying of news)"      (Mark Wegierski, 2016)

"The first moral obligation is to think clearly." Blaise Pascal.

“If you don’t read the newspapers you are uninformed — if you do read the newspapers you are misinformed.” Mark Twain

Sunday 14 April 2019

Futile Technological Dreams

Some scientific ventures and technological projects should stop pretending that there will be a big breakthrough that will change everything. After decades of unfulfilled promises and lack of significant progress, some research areas should be mothballed, or at least scaled back so that the money and effort can be applied elsewhere. Here are some currently expensive science and technology projects that, in my humble engineering opinion, will not pan out as long hoped, notwithstanding all the public hype surrounding them:

Practical Fusion Energy:
The joke for the past 50 years or more has been that fusion power generation is always just 20 years away. It would be great if fusion power became a reality, and some of the research in that area is fascinating, but I think it is safe to say we will not have a practical (much less financially viable) fusion generating plant in this century. Even if the science hurdles could be overcome - and they are still huge - the technology is so esoteric, marginal and complex that building a fusion reactor able to run reliably for years is well beyond anything conceivable today.

Fusion research focuses on two areas: magnetic and inertial confinement. Magnetic confinement is the original hope: strong magnetic fields in some three-dimensional configuration hold a very high-temperature plasma at high enough density, for long enough, that tritium and deuterium nuclei fuse together into helium nuclei.  This reaction releases neutrons and a lot of energy which, when thermalized, can drive a normal turbine/generator for electrical power.
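
For scale, the commonly quoted figures for this reaction are about 17.6 MeV released per D-T fusion, most of it (about 14.1 MeV) carried off by the neutron. A rough calculation (a sketch, not a plant design) shows how many fusions per second a 1 GW(thermal) reactor would need to sustain:

```python
# Rough arithmetic for the D-T reaction: ~17.6 MeV per fusion event.
MEV_TO_JOULE = 1.602e-13
energy_per_reaction = 17.6 * MEV_TO_JOULE      # ~2.8e-12 J per fusion

thermal_power = 1e9                            # a 1 GW(thermal) plant
reactions_per_second = thermal_power / energy_per_reaction
print(f"{reactions_per_second:.1e} fusions per second")   # ~3.5e20
```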

The basic problem is maintaining the temperature and density high enough, for long enough, to produce more output energy than went into getting the reaction started. The plasma leaks out of the magnetic fields, the strong fields become unstable, and the peak conditions quickly fade. The newest fusion reactors have to use all sorts of special effects and added technology to achieve even momentary energy break-even. And then the system has to be shut down and refurbished, or at least cleaned up and re-initialized for the next experimental run. Unwanted reaction products pollute the carefully assembled gas cloud. High-energy particles irradiate and damage the interior walls of the chamber, and all the external high-power equipment is pushing the envelope of what is technically possible, so it is difficult to keep in operation and expensive to maintain.
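
The break-even problem can be put in rough numbers using the Lawson "triple product" criterion for D-T fusion, commonly quoted as roughly 3e21 keV·s/m^3 of density times temperature times confinement time. The sketch below uses illustrative plasma parameters (not those of any particular machine) to show how the comparison works:

```python
# Approximate D-T ignition threshold (triple product); the exact figure
# depends on assumptions, so treat this as an order-of-magnitude check.
LAWSON_TRIPLE_PRODUCT = 3e21   # keV * s / m^3

def triple_product(density_per_m3, temperature_keV, confinement_s):
    return density_per_m3 * temperature_keV * confinement_s

# Hypothetical plasma parameters, for illustration only:
n, T, tau = 1e20, 15.0, 1.0
tp = triple_product(n, T, tau)
status = "above" if tp >= LAWSON_TRIPLE_PRODUCT else "below"
print(f"n*T*tau = {tp:.1e} keV*s/m^3 -> {status} the threshold")
```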

It seems the best promise is to make the reactor much larger, but that brings its own technical problems, not to mention costs. Slight improvements in materials and better understanding of the physics are unlikely to achieve the major advances needed for a commercially successful fusion station, much less the electricity "too cheap to meter" that was originally dreamed of.

Inertial confinement schemes suffer even worse problems. In this approach, a small target sphere, with metal layers over the reacting deuterium and tritium atoms inside, is carefully injected into the reaction chamber and then blasted at a precise moment with focused, high-power laser beams on all sides. If everything is perfect, the metal shell is vaporized and blows off, pressing strongly against the reaction gases inside and compressing them enough, for a few nanoseconds, that some of their nuclei fuse together, releasing a lot of energy, which then blasts apart the gas and ends the reaction. Each target is small and expensive to make, and yields only a small amount of energy in a single burst. The hope is that the targets can be made cheaply, and injected and pulsed regularly, to provide continuing pulses of thermal energy and hence generate more power than is needed to operate the lasers. Experimental systems have reportedly come close to energy balance in single-shot tests.
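
A back-of-envelope power balance shows why the repetition-rate and efficiency problems matter so much. All of the numbers below are hypothetical placeholders (they do not describe any actual facility); they are only meant to show the shape of the accounting:

```python
# Pulsed (inertial) plant balance with assumed, illustrative numbers.
yield_per_shot_MJ   = 20.0   # fusion energy per pellet (assumed)
shots_per_second    = 10.0   # repetition rate needed for steady power (assumed)
laser_wallplug_MJ   = 30.0   # electricity drawn per shot by the lasers (assumed)
thermal_to_electric = 0.35   # turbine/generator efficiency (assumed)

gross_electric_MW = yield_per_shot_MJ * shots_per_second * thermal_to_electric
laser_demand_MW   = laser_wallplug_MJ * shots_per_second
print(f"gross output ~{gross_electric_MW:.0f} MWe vs laser demand ~{laser_demand_MW:.0f} MWe")
# With these placeholder numbers the plant would consume far more than it makes.
```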

The lasers involved are huge, very expensive, and difficult to keep focused and operational. Their lenses tend to heat up, distorting their shape and focus. The power supplies needed to drive the lasers are huge, and coordinating the many laser beams is tricky, even for a single shot. The target pellets have multiple ablative layers which need to be precise in order to achieve uniform compression. Keeping the system, especially all the optics, clean amid the exploding pellets and reaction products is a major hurdle for continuing operation. And then there is getting the released energy out of the reactor efficiently enough to make the power generation worthwhile, especially since much of the system needs to be cooled for proper functioning. As with magnetic confinement, it will be a steep uphill battle to solve all the engineering problems needed for continuous commercial operation.

Don't get me wrong, I think fusion research should continue. We will learn more about the physics and the techniques for controlling plasmas, the management of microscopic fusion explosions, and perhaps more important, the technologies developed to push beyond the current boundaries of materials, manufacturing, control mechanisms, and human understanding. But do not pursue this work hoping to build a practical fusion generating station in the foreseeable future.

As a reality check, compare how easy it was to get a stable nuclear fission "atomic pile" working in the 1940's with the troubles nuclear reactors have over the long term today: leaks, cracks, fuel handling and processing, redundant safety devices, human error, waste materials, decommissioning, etc. Given the technological difficulties of sustaining fusion even experimentally, building and operating such a reactor commercially would be an order of magnitude more expensive and difficult. What power company is going to want to invest tens of billions in such a doubtful venture? Perhaps in a hundred years, with new technology, materials and know-how, such a power plant may become feasible, but even then, will it be practical?

AI Consciousness:
There has been a lot of hype about artificial intelligence in the past decade or so: deep learning, Jeopardy-winning Watson, game playing champions, the dream of uploading our consciousness, the search for an artificial general intelligence, attempts to model animals' brains, and of course, the singularity, in which AI becomes smarter than humans, evolves itself and eventually takes over the world, for better or worse for mere humans.  It is true there have been impressive advances: self-driving cars, natural-language processing, expert systems, big-data mining, and so on. But none of this comes anywhere near a self-aware, conscious AI, notwithstanding all the sci-fi movies and various futuristic warnings or promises in the news and media lately.

Any AI system is, basically, a complex algorithm, using inputs and information from its memory to do calculations, make decisions, draw conclusions, and produce outputs, all based on its programming.  Yes, some programs can adjust themselves to improve their game playing, or incorporate new data or goals set for them, but they do not "choose" their own goals unless higher-level goals are preprogrammed.  They do not think for themselves, unless you want to count what AI does as "thinking".  Unlike humans, they have no subjective sense of self, purpose, or meaning.  A chess playing program does not "know" that it is playing chess.  Sure, it may be able to answer questions about chess and tell you it is playing that game, but it is only doing so because of its programming, done by clever human minds.
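
A toy example makes the point concrete: the "agent" below looks purposeful, but its goal is nothing more than a scoring function that its programmer hands it. (This is a deliberately trivial sketch, not a description of any real AI system.)

```python
# The program "chooses" only in the sense of maximizing an external metric.
def greedy_agent(actions, reward):
    # pick whichever action scores highest on the programmer-supplied metric
    return max(actions, key=reward)

# The programmer, not the program, decides what counts as "good":
actions = ["advance pawn", "develop knight", "castle"]
programmer_goal = {"advance pawn": 0.1, "develop knight": 0.6, "castle": 0.9}
print(greedy_agent(actions, lambda a: programmer_goal[a]))   # -> "castle"
```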

No computer can "understand" the "meaning" of a poem or painting.  It may be able to parse the text and tell you what the poem is about, but it cannot think about the poem and what it means to itself, you or me.  Any computerized "emotions" are merely simulated in response to specified inputs.  No AI can have "intentions" aside from optimizing some parameter specified by its programmers.  Thus, all the apparent intelligence and seeming autonomy are narrow and programmed into it by humans, who do know what they are trying to achieve.

Don't get me wrong, AI has an amazing and huge future: medical diagnosis, human assistance, technology management, scientific research, etc. are all being helped by AI today, and that will only increase in the future. We will eventually get truly self-driving cars (although not as soon as some people project). We may get AI systems that can diagnose diseases better than human doctors. And so on. This work will continue and I hope it continues to benefit mankind in realistic ways.

It is quite possible that at some point in the near future, someone's clever AI system will be able to pass a fair Turing test, but that will not make it conscious, the equivalent of a human, much less of superior intelligence. Such a machine will have been programmed by humans to emulate a human well enough to fool other humans, but that won't make it human-like in any true sense. Humans are more general and flexible than any computer, and humans have self awareness, subjective intentionality, and the whole broad set of capabilities. They understand what they are doing!

So do not worry about AI's taking over the world or usurping humans. There is more danger from (human) hackers taking over the AI to crash a self-driving car, or damage a nuclear reactor's controls. Or some terrorist organisation getting hold of a smart weapon and sending it into the White House. It is true that some people will lose their jobs to AI systems, but that will happen slowly, and people are usually flexible enough to find other employment niches. So AI will be disruptive, but not destructive, nor the end of the world as we know it. The singularity, uploading your consciousness, sentient androids, and equal rights for AI's are all science fiction staples, but they will not happen in your lifetime, and perhaps never.

Quantum Computers:
There is a lot of research going on trying to develop a useful quantum computer. This is a device or system where the "bits" of a standard binary computer (all those 1's and 0's) are traded for "qubits". A qubit has the quantum property of being in an undefined state, a "superposition" of being both a "1" and a "0" simultaneously. That sounds weird, as does pretty much everything about quantum physics, but it is perfectly true. So far, no problem.
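
In more concrete terms, a single qubit can be written as a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. Here is a minimal sketch, using an ordinary classical simulation (which is exactly what becomes infeasible once many qubits are involved):

```python
import numpy as np

zero = np.array([1.0, 0.0])          # the state |0>
one  = np.array([0.0, 1.0])          # the state |1>
plus = (zero + one) / np.sqrt(2)     # equal superposition of |0> and |1>

probabilities = np.abs(plus) ** 2    # Born rule: |amplitude|^2
print(probabilities)                 # [0.5 0.5] -- "both at once" until measured
```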

In principle, with enough qubits all working together ("entangled" in the quantum jargon) such a computer could solve certain types of problems and simulate physical systems much faster than today's computers. Algorithms for quantum computers have been developed to take advantage of this capability, and have been shown to work on systems of a few qubits. Thus, the principle of quantum computing is sound.

The major hurdle to the development of powerful quantum computers is the number of qubits that can be entangled and maintained for the duration of the algorithm's processing. It is relatively easy to get a large number of atoms entangled, but such ensembles do not function as individually addressable qubits. To be useful, the qubits have to be maintained individually, programmed individually and properly, allowed to process the desired algorithm, and then read out at the end of the run. The trick is to do that while keeping them all entangled long enough that they can all work on the algorithm simultaneously. In all such systems so far, "decoherence" sets in after a brief time, as the qubits begin to lose their entanglement. At that point - usually a few microseconds - errors creep into the processing and the results quickly decay into nonsense.
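
A crude way to see how fast errors accumulate: if each qubit operation fails independently with some small probability p, the chance of an error-free run falls roughly as (1 - p) raised to the power (qubits times circuit depth). This is a simplification of real error models, offered only to show the trend:

```python
def clean_run_probability(p_error, n_qubits, circuit_depth):
    # probability that no operation anywhere in the circuit goes wrong
    return (1.0 - p_error) ** (n_qubits * circuit_depth)

for n in (10, 100, 1000):
    print(n, "qubits:", f"{clean_run_probability(1e-3, n, 100):.2e}")
# 10 qubits -> ~0.37, 100 -> ~4.5e-05, 1000 -> effectively zero
```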

In the initial years of development, various ways to generate and maintain qubits were researched. Then various ways to get several qubits working together were developed. It became possible to keep up to perhaps ten or a dozen qubits entangled and accessible long enough for testing simple quantum algorithms successfully. However, any useful quantum computer - "useful" meaning sufficiently faster than a normal computer to make the cost and effort worthwhile - would have to use hundreds of qubits, and preferably thousands, and that has proven to be a huge challenge.

Quantum computers boasting ten or so qubits are available for research purposes today. There are claims of systems having 40 or more qubits, but those are questioned by many people. Any system with 40 qubits that cannot get them all entangled together, carefully programmed, and kept in that state long enough to perform useful processing is not very helpful. I am not an expert on this, of course, but from what I have read, every added qubit makes the system much more difficult to set up, more susceptible to noise and errors, and reduces the duration of the entanglement. As a result, there has not been much advance in the past few years.

Based on this pessimistic outlook, I do not expect useful quantum computers to be ready for sale any time soon, and probably not for a long time, if ever. Here too, the research and technology are fascinating and should continue, but we shouldn't base our support for it on hyped-up promises. I am not alone in my doubt; a recent IEEE Spectrum article described some additional concerns about the feasibility of quantum computing as the number of qubits increases.

Origin-of-Life Research:
This one is somewhat different in that the research does not aim to produce any new products or commercially useful processes, although if successful, some would doubtless flow from the results. Rather, the purpose here is to explore how life might have got started on Earth some 3.8 billion years ago or so, and as an added benefit, to see what conditions and processes could possibly cause life to arise on other worlds. This is called abiogenesis - life arising from non-life.

These are valid research projects, but there is another motive: to show that undirected natural processes could bring forth simple life forms, thereby undermining Intelligent Design theory and the need to posit supernatural creation help or intelligent guidance. As Richard Lewontin, an atheist scientist, remarked, "materialism is absolute, for we cannot allow a divine foot in the door." Thus, some scientists are consumed with seeking only materialist causes, however hypothetical or unlikely.

And unlikely they are. Charles Darwin speculated about life beginning in a "warm little pond" somewhere on Earth, but the biomolecular science needed to define what that meant only came about a hundred years later, with research into proteins, DNA, the genetic code and other aspects of living cells and organisms. There have been numerous hypotheses put forward about how life might have got started: thermal cycling in warm ponds, non-equilibrium chemistry around sea-floor vents, surface catalysis on clays, organic molecules deposited by comets, lightning strikes in the atmosphere, even panspermia (seeding from outer space). It seems any superficially plausible speculation is good enough to get published, hyped up for the public, and even funded for further research.

The detailed chemistry behind all these ideas, however, involves immense hurdles, even with the probabilistic resources available over the surface of the Earth and hundreds of millions of years. Trying to find credible chemical pathways from simple organic molecules to complex biology capable of self-reproduction is extremely difficult, and so far has proven impossible, even in carefully controlled laboratory experiments, with expert help in selecting pure chemicals, exacting conditions, purifying intermediate steps, preventing unwanted reactions, and then changing everything for the next step. To seriously suggest this all happening in a random, dilute mix of molecules, with no plan or guidance, beggars belief.

Among the many difficulties: the building blocks of life are difficult to construct in mixed-chemistry situations, and they tend to degrade or agglomerate in the wrong ways once created.  Getting a clean mix of nucleotides or amino acids together under any realistic early-Earth conditions is essentially impossible.  Getting them to then polymerize properly is another massive hurdle.  Life uses only left-handed amino acids (and right-handed sugars), but non-biological processes tend to produce left- and right-handed molecules in roughly equal measure, and the two hands are then difficult to separate.
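
The handedness problem alone can be put in simple numbers: if each monomer added to a growing chain is left- or right-handed with equal probability, the chance of ending up with an all-left-handed polymer of length n is (1/2) to the power n.

```python
# Probability of a homochiral (all one-handed) chain assembled at random:
for n in (10, 100, 300):
    print(f"length {n}: {0.5 ** n:.1e}")
# length 100 is already ~8e-31; a modest protein-length chain (~300) is ~5e-91.
```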

The simplest possible cell needs more than the building blocks. Any realistic cell needs proteins to operate, specific sugars to consume for energy, various lipids for the cell membrane, other carbohydrates, and complex enzymes to make the biochemical processes possible. Any living system uses DNA and other codes to specify these complex items, as well as pre-existing molecular machines (proteins) to make them and do the actual work. How any of this could have come about naturally anywhere is all but incredible. How all of it could have come about in the same place, at the same time, by unguided natural processes is essentially impossible.

Ah, but you will say, there was an intermediate step using RNA molecules as both the early genetic code and the functional apparatus to do the work. Thus was born the RNA world hypothesis, wherein RNA somehow came about and was active enough to not only operate as a simple life form, but also store information and replicate itself, allowing Darwinian evolution to get going. This overlooks the need for lipids and sugars, but sounds interesting. It is presumed that DNA code storage and the protein synthesis machinery evolved later via natural selection.

Aside from the problem of supplying the lipids and sugars, and the undefined Darwinian magic, not to mention the chirality (left-right chemistry) issue, RNA molecules do not spontaneously come about by non-biological means, much less come together in long polymer chains. Even under controlled lab conditions (intelligent design here!), the best active RNA molecule that scientists could design and create, so far as I am aware, is one that cuts itself into two pieces; not exactly a promising start. More recently, a laboratory-created RNA strand that can catalyse its own "reproduction" from other carefully produced smaller strands, under precise lab conditions, has apparently been demonstrated. The gap between these experiments and the creation of an "RNA world" under naturally occurring conditions is immense.

How any RNA protocell would develop the transcription hardware needed for DNA before needing DNA, or how it would make DNA before being able to use it has never been elucidated. A lot of intracellular machinery is needed for the DNA-RNA-protein synthesis process to work, so it would all be needed at the same time since unguided processes, even Darwinian ones, cannot foresee a future need for specific complex chemistries.

Over and above all of this is the information problem.  Even the simplest life form has megabits of functionally precise DNA code, which is needed to specify all of its hundreds of proteins and to control its cellular processes. In our uniform experience, huge amounts of meaningful information arise only from an intelligence, or from a machine designed and built by an intelligence. There is no credible natural mechanism that can generate such volumes of genetic code, much less usefully insert it into a complex chemical system. In theory (but questioned in practice), the Darwinian mechanism can add a bit or two at a time, but that cannot account for the megabits of code needed before natural selection has anything to work on - a self-sustaining, self-replicating protocell.
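
To put the "megabits" claim in rough numbers: each DNA base pair can carry at most two bits, so even a minimal bacterial genome on the order of Mycoplasma genitalium's (roughly 580,000 base pairs) represents about a megabit of sequence information, before any question of its functional arrangement arises.

```python
# Rough information content of a minimal bacterial genome (approximate figures).
genome_base_pairs = 580_000              # ~ Mycoplasma genitalium, for illustration
bits = genome_base_pairs * 2             # 2 bits per base (4 possible bases)
print(f"~{bits / 1e6:.1f} megabits")     # ~1.2 Mbit
```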

Every effort to produce life from scratch, even under lab conditions with scientific guidance and control, has fallen far short of the goal. With each new finding in molecular biology, the complexity increases and new hurdles to abiogenesis are revealed.  Thus, I do not expect that humans will ever find a credible natural route to making life.  I am not an expert here, but other, more knowledgeable chemists agree.  Such research efforts should probably continue, as new pathways and processes may be revealed, but the prospects are not promising, notwithstanding the hype in popular magazines. Perhaps research should expand to explore how much functional information is needed, and allow for the possibility that intelligence provided that information, along with the initial biochemical tools needed to interpret and use it.