Category Archives: Ah, Sophia!

Contemporary philosophical musings and their various relatives.

Road Map to Ultimate Reality?

David Deutsch has plotted out a course toward understanding ultimate reality.  His map includes four courses, in fact.  Here they are:

Theory of Evolution

Quantum Physics

Theory of Knowledge

Theory of Computation

He is endeavoring to move toward a new way of looking at knowing and explaining.

Here is a brief introduction to Constructor Theory of Information.

Constructor Theory

And here is Dr Chiara Marletto discussing this same subject in an interview.

Beautiful, smart, and driven. But I digress.

Get out there and get thinking!

9 July:  Found this clip which is relevant here.

Reason, baby!

JamesIsIn

Contemplating Dreadful Experiences

We often forget that we are not the ones who commit suicide, only the recipients of the realization that another has decided to leave us.  It’s easy to forget.

The media, as a general rule, does not report on suicides.  The reason for this is that when the media reports on suicide there is a corresponding uptick in the suicide rate.  We might think of this as a sort of permission slip passed around the newsrooms and living rooms of this Earth.  However, when the person who commits suicide is a celebrity there is little avoiding that reporting:  we all want to know what has happened, the consequences be damned!

Last night one of my all-time favorite bands lost a singer and friend.  Let us take a moment.

That angelic voice, you will note, is silent.  This is the future echoed from the past.

Please take some time out to say hello to your old friends.  They may appreciate hearing from you.

Good-bye, Chris.

JamesIsIn

Autism and the Brain EQ

Here is a great article discussing one particular mental ordering (autism) and how diagnosis does and does not change lives.  I find this story compelling because it lays bare how we define mental orders and how mental ordering defines individuals.

When We Realized My Husband Has Autism

My usual metaphor for brain traits is an enormous equalizer loaded with perhaps billions of sliders.  You could conceptualize these sliders as broad categories like autism or obsessive-compulsive disorder, but it may be more appropriate to scale the sliders as individual traits, like those which make up the diagnostic criteria for the various mental states we have so far defined.

In any given individual a particular slider may be higher or lower depending on various contributing factors.  It would be somewhat rare to locate someone who had all the sliders for a particular concept (autism or even masculinity) pushed all the way to the top (or pulled all the way down), but we should expect that for any given individual few sliders would be resting right in the middle.

(This model should work for genetic expression as well.)
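For the code-minded, here is a minimal sketch of the equalizer metaphor in Python.  The trait names and the crude averaging are my own invented placeholders, not diagnostic criteria; the point is only the shape of the model: many independent sliders, with a broad label read off as a summary over some of them.

```python
import random

# Toy rendering of the "brain equalizer": each trait is a slider in
# [0.0, 1.0].  The trait names below are invented placeholders.
TRAIT_NAMES = ["sensory_sensitivity", "pattern_focus", "social_reciprocity"]

def random_brain():
    """Sample one individual: each slider lands somewhere along its track."""
    return {name: random.random() for name in TRAIT_NAMES}

def label_score(brain, traits):
    """Read a broad category as a crude average over its trait sliders."""
    return sum(brain[t] for t in traits) / len(traits)

brain = random_brain()
print(brain)
print(label_score(brain, TRAIT_NAMES))  # one aggregate, hiding the detail
```

Notice that the aggregate throws away exactly the information the metaphor says matters: two very different brains can produce the same summary number.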

Food for thought.

JamesIsIn

The Source of Bertrand Russell’s Famous Tea Pot Argument

I really like Bertrand Russell, and so should you.  He’s one of the last great intellectuals.  Sure, he’s British, but he’d hold his own in any circle of American pragmatism.  He’s got brainy gumption and moxie.

Probably he is most famous for his tea pot argument.  Essentially it stakes out the notion that a lack of evidence disproving something cannot be taken as sufficient cause to believe in the un-disproved whatever-it-is.

As for the source: this argument appears in a paper Russell wrote on commission for Illustrated magazine in 1952.  The magazine never actually published the paper, but it is probably available in a collection somewhere.  (This one looks like a good collection, but I don’t see the article in question.)

You can find the text here and here (the second link provides a couple of interesting footnotes).

It’s a short article and well worth reading.  I hope you enjoy Russell too.

Go get ’em.

JamesIsIn

Happiness Just Might Be a Warm Cup of Coffee

Rocky Raccoon may have been satisfied reading that book placed by the Gideons in his hotel room, but I have something better for your reading list.  I read this book a while back and am only now reminded to say something about it, what with all this hubbub about Starbucks and guns.

The book, by John R. Lott, Jr., is called More Guns, Less Crime: Understanding Crime and Gun Control Laws.  It is the most comprehensive analysis of crime statistics I have ever seen.

More Guns, Less Crime

The gist of his argument is that when a certain kind of permissive (“shall-issue”) concealed-carry law is implemented there will be an associated reduction in the rates of violent crime (both locally and in neighboring areas).  The statistics seem to uphold this theory and provide especial insight into the relationships between these same laws and the protections afforded to women and minorities.

I think folks on either side of this issue (as well as anyone on the fence) will benefit from reading this book.  I make no bones about it: he is writing (even if from the compulsion of reason) in support of the laws he finds protect us best.  Whether you are swayed by the power of reason is up to you, but you will find much within the pages to respect.

Of course the Starbucks issue is really a non-issue.  It is two opposing groups attempting to get a corporation to sponsor their petty debate.  This is not a matter for a corporate boardroom decision.  This is a matter for legislation.

I think we have great legislation in Washington state (very much in line with what Lott suggests provides the safest social sphere), so I’m not going to get all up in arms (what?) if cowboys start spinning their spurs while waiting on their cappuccini.

I’d rather see Starbucks fix their grammar in whatever language they are under the impression they use.

Lock and load, baby-doll.

JamesIsIn

Storing Errors in the Brain and Creative Sparks

It occurs to me that there could be a connection between those mistakes we make, which we store as truths for even small fractions of time, and the later ability to create or innovate.  As I can at this time conceive of no scientific test by which this speculation may be either confirmed or refuted, it must remain solidly in the realm of philosophy.  First I will outline what I mean and then I will look at the possible consequences should this be true.

The brain stores memories, experiences, and the like as synaptic connections within its neural networks.  Some of these stored data points we can think of as true and others as false.  Consider a child who encounters a small furry four-legged animal.  They may, adorably, call that animal a dog.  This experience will become part of their neural matrix relating to dogs and fur and animals &c—complete with many storage points, interrelations, and connections.  When later that child discovers this animal was and is in fact a badger, adjustments will be made across the matrix to allow the child to correctly distinguish between the true dog and the true badger.  However, what is important here is the false badger.

For the duration in which the badger was falsely identified as a dog, pathways were formed and connections were created, all of which can be called upon later.  Clearly the person in question will not want to falsely identify the badger as a dog, but these pathways and connections provide alternative routes of thinking along which new ideas and innovations might be prospected.

Consider next that any brain, young or old, will contain thousands upon thousands of these false truths.  They may have been briefly held or long-standing; what matters is that what is false was thought true.  This branching allows for later branching.  It seems to me that this later branching can assist in the creative and innovative processes—branching into new ideas heretofore unthought.

I do not claim that this is either the only or the necessary cause of the creative spark, merely that it can be considered a partner in the innovative process.

If this is the case, then we might surmise that any brain which was not capable of allowing for false truths of some minimal duration would also be incapable of exploiting those alternate pathways for creation and innovation.  Clearly this would have an impact in the field of Artificial Intelligence.  Such an artificial brain must be able to store indefinitely, and use at least occasionally, information which is incorrect.

My understanding of the current breed of AI decision-making networks is that they learn by adjusting weights over time.  These weights do not store the incorrect pathways; rather, they replace them.  This may well yield positive results in creating brain-like computing devices; however, it may be necessary to allow these artificial brains to maintain databases of false truths, mistakes, and incorrect impulses if we are to see them create and innovate.  A sketch of the distinction follows.
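To make the contrast concrete, here is a minimal sketch in Python under my own assumptions (a single weight standing in for a whole network, and a plain list standing in for the proposed database of false truths); it is not any real AI architecture.  Ordinary learning overwrites a mistaken belief; this toy archives it first so it remains available to wander through later.

```python
import random

class MistakeKeepingLearner:
    """A toy learner that adjusts one weight, as ordinary networks do,
    but archives each superseded belief instead of discarding it."""

    def __init__(self):
        self.weight = random.uniform(-1.0, 1.0)  # the single adjustable belief
        self.false_truths = []                   # superseded beliefs, kept forever

    def predict(self, x):
        """Classify an input using the current belief."""
        return 1 if self.weight * x > 0 else 0

    def learn(self, x, label):
        """On a mistake, archive the old weight, then replace it."""
        guess = self.predict(x)
        if guess != label:
            self.false_truths.append(self.weight)  # keep the false truth
            self.weight += (label - guess) * x * 0.1

    def wander(self):
        """Revisit a discarded belief as a seed for later exploration."""
        return random.choice(self.false_truths) if self.false_truths else self.weight

learner = MistakeKeepingLearner()
for x, label in [(1.0, 1), (-1.0, 0), (2.0, 1)]:
    learner.learn(x, label)
print(learner.weight, learner.false_truths, learner.wander())
```

The wander method is the speculative part: a creative process that deliberately samples from beliefs the learner has already rejected.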

This is speculative and probably highly condemnable by stronger philosophers than myself, but here it is anyway, out in the theater of ideas ripe for your consideration.

The Power of Misinformation

Nobody wants to kill their own children.  I mean, that may seem like a good solution at tantrum time, but, all kidding aside, parents for the most part really want to see their children survive them.

However, the desire to feel good about protecting your children can lead down a path where feelings outweigh reasoned arguments.  Thanks to my friend Eric for sending me this great article on one facet of the crisis in this country: the irrational, wish-dream advocates’ attack on all intellectual and rational pursuits.

This article at Wired (“An Epidemic of Fear: How Panicked Parents Skipping Shots Endangers Us All”) does a good job of summing up the current information about immunization and the alarming trend to ignore the body of science supporting it.  Definitely a good read.

What is interesting to me (and to a number of friends with whom I have discussed it) is this willful embrace of ignorance.  It’s not present just in this immunization issue.  Anytime truth comes into conflict with emotion, there will arise a faction that clings to untruth for the sake of the heart-strings.

While I am certainly capable of sympathy with those many positions toward which feeling leads us, an important part of growing up is recognizing that the world is rarely as we wish it.

It’s time to grow up, everybody.

What this article posits in its final paragraph is likely true: “There will always be more illogic and confusion than science can fend off.”  Nonetheless, we can and should raise our rational voices against the gale of emotive blabbering.

It is no longer enough to rest assured that the truth will prevail in time.  Yes, the Catholic Church did finally pardon Galileo.  But he died blind and separated from his daughters, under house arrest at Arcetri, near Florence.

Raise up your rational voices.

Time: Unreal or Surreal?

Recently I read A World Without Time: The Forgotten Legacy of Gödel and Einstein by Palle Yourgrau.  He argues (at least in part) something that I’ve often contemplated, namely that time itself is not real.

I have put forward in conversation the idea that time is not real, that time is an illusion.  “But, wait,” you say, “time is whizzing past me like crazy!”  Well, events are whizzing all around us.  No doubt about that.  But time is merely the framework which we use to explain our ideas of now and later; of past, present and future; of what was, what is, and what will be.  Time is our mental construct to explain the passing of events.

Yourgrau’s contribution is to analyze the work of Einstein and Gödel in an effort to demonstrate not only that they both thought something a little different about time than we might expect, but that they each went a long way toward demonstrating a particular unreality of time.

Simply put, Einstein only stated that time needed to be considered as if it were a dimension in order to work out relativistic events.  It’s sort of like using time as a metaphor (or perhaps using dimension as a metaphor).  It is now common to take that metaphor as fact and fabric of reality.  Gödel comes in handily afterwards in his meticulous fashion and demonstrates that not only are time-looped relativistic universes possible, we may actually be living in one such universe—accidentally spawning the so-called grandfather paradox.
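To give “time as a dimension” its usual concrete form (this is the standard textbook expression, not anything quoted from Yourgrau), special relativity bookkeeps time as a fourth coordinate in the spacetime interval:

```latex
% Minkowski line element: time enters the geometry as a fourth
% coordinate, but with the opposite sign from the spatial ones.
ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2
```

That minus sign is the point of the metaphor: time is carried along as a dimension for the sake of the calculation, yet it is not interchangeable with space.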

The book is loaded with personal correspondence between some of the brightest minds of the last century.

I have read a lot of books on the history of the sciences and mathematics, and this is one of the better written.  I think anyone with even a peripheral interest in these subjects will find it a pleasant read.

English Won, Oh, Won!

Since the inception of the Internet many great things have come to us dancing humans.  It has been a great boon for humanity—intellectually, informationally, pseudo-sexually.

One unintended consequence though has been the unleashing of a host of language butchers.

Mail became e-mail, and though we will never hear anyone with the slightest grasp of the English language say “the postal carrier brought me three mails today”, somehow even the brightest among us will whip out “I got e-mails from my mom and my brother today”.  E-mails sounds like something naughty.  Where are these e-males coming from, and do they also have e-females?

I do occasionally chuckle during my chat conversations.  When I wish to indicate to my interlocutor that I have been moved to spontaneous giggles I use the old stand-by “hahaha” or some deviation therefrom.  I am loath to type some abominable abbreviation such as smomnilsfh—spitting milk out my nose I’m laughing so fucking hard.  Perhaps this is because I can touch type.  Perhaps not.

And what’s the deal with using two periods in a row? I had a friend explain to me that when he did it he wanted less of a pause than an ellipsis (three consecutive periods). I asked him how long the pause of an ellipsis was. He had no answer. Use an ellipsis or a period.

I admit that I had these bad attitudes concerning spoken English long before the Internet showed up.  I reflect with fondness on once hearing my friend say that something or other was across from some other something and my asking him whether he spelled that acrost or a-crossed.  Yet this pronunciation lives on.

As does pronouncing height as though it contained a second h—heighth.  So when I hear a native English speaker criticizing some immigrant struggling with English as a second or even third language, I shake my head in awe.  “When you have yours in order, then we’ll talk,” I chide them.

Which nicely brings us to one of the most pervasive butcherings to date.  There is a certain alleged Internet provider who will remain nameless—but who is easily identified by the copious discarded CDs offering hour upon free hour of alleged Internet service—this filth monger thrust upon our beloved Earth a phrase vile and now disastrously ubiquitous: you’ve got mail.  Have got?  What is the sense of this misconjugated compound verb?  You have mail.  You’ve got cancer?  You have cancer.  Let us drop forever the superfluous verb.  Though I feel obligated to point out that “You have gotten better” is kosher.  Still, prefer something like “you’re feeling better” or “your health has really improved”.

Just to be on the fair side, I feel obligated to offer some useful advice for anyone who might be attempting to improve their English.  These are a couple of tricks I have found worth keeping in mind so as to look slightly smarter than I actually am.

How to Outsmart a Chimp

Let us take a look at a common mistake and reveal a simple solution to getting it right.  Lay v lie is a trouble.  It is especially compounded by the curious reality that the past tense of one happens to be the infinitive of the other.  No matter (and don’t worry about what that actually means).

The trick is merely to remember this simple phrase: “now I lay me down to sleep”. The important part is “lay me” (verb → object). This relationship reveals all you need to know to choose lay or lie. When you want to place something down, you lay that something down. I lay my body down. I will lay the blanket down. If no thing is being laid, then you choose lie. I lie down. I was lying down. I will lie down. (Down is a direction, not a thing.)

Now the tricky bit is that I lie now but I lay earlier: lie conjugates lie, lay, lain, while lay conjugates lay, laid, laid. Confusing? Sorry. Not my fault. This holds whenever there is no helping verb. So: I was lying down. I lay down earlier. I was lying down when you called. In the end, though, you can probably avoid the more confusing conjugations by using other verbs.

Initially I was slow to parse this guideline and was constantly failing in speech.  Writing is paced more slowly, and there I was better able to work out the correct usage.

Merriam-Webster has a great little blurb on this distinction and its history at the bottom of its definition for lay.

Computer terms got you up in arms?  A bit is the small one, so it takes the small letter (b); a byte (eight bits) is the big one, so it takes the big letter (B).  Likewise megabit (Mb) and megabyte (MB).  No need to confuse these two confusing terms any longer.  I bit two bytes.
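And since the letters matter in practice, here is a tiny sketch (my own toy illustration) of why: advertised megabits per second are an eighth of the megabytes per second you actually receive.

```python
BITS_PER_BYTE = 8  # a byte is eight bits, hence b versus B

def megabits_to_megabytes(megabits):
    """Convert Mb to MB by dividing by eight."""
    return megabits / BITS_PER_BYTE

# A "100 Mb/s" connection delivers only 12.5 MB each second.
print(megabits_to_megabytes(100.0))  # 12.5
```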

And just for fun, let’s talk about the usage of shim v shimmy.  I just made this one up, so be gentle: “Shims within the groove make me shimmy”.

A lot v allot is typically a spelling problem, but here is a phrase to help: “A lot of folks know well how to allot their time”.

In the end, if you are going to break the rules be certain you know the rules you are breaking.

Ok, take it for what it’s worth.  I’m going to bed.

e-evaluation

This is in response to a page (scroll to the bottom) I stumbled upon (in the traditional manner).  On this page Professor Donald E. Knuth (Stanford University) puts forward the argument that since words traditionally lose their hyphens once they have been commonly accepted into the vernacular, we should drop the hyphen from e-mail.  While I do believe that much of what he says is correct, I also see e-mail as a particular case where his arguments should not apply.  I will give three reasons to support this claim.

First, though it is true that words will generally lose their hyphens after some appropriate break-in period, it is not the case that all hyphenated words will necessarily lose their hyphens.  He does offer a couple of good examples of recent additions to the English language which have appropriately dropped their hyphens: nonzero and software.  However, counter-culture, counter-clockwise (and anti-clockwise), drug-addicted, free-range, ex-wife, multi-protocol, and right-click are pretty much doomed to always carry around that little extra punctuation.  No harm there.

Second, it would seem more than a little odd to talk about a breakin period.  (Is that a period of break dancing?)  There are clearly examples where the hyphen is required for clarity.  Break-in, re-examine, and re-organize are all good examples where their unhyphenated counterparts are clumsy, to say the least, as they are parsed along the page.  Breakin, reexamine, and reorganize become needlessly difficult to read.

Those first two points address hyphenation in general, but my third relates directly to e-mail.  The e in question is short for electronic, so what we are really talking about is electronic mail.  This mode of conjoining words is very different from compounding two words in the ordinary manner, such as door and knob creating doorknob.  As such I am inclined to think that words created in this manner are less prone to losing their hyphens than compound words created through the more traditional channels.

For these reasons—because not all words lose their hyphens, because e-mail is easier to parse on the page, and because e-mail is a special case of compounding words—we should retain the hyphen in e-mail.

As always, your thoughts are encouraged.

JamesIsIn