Headline, October 05 2023 / HONOURS : ''DOUGLAS LENAT'S DOORBELL''




RULES IN THE MILLIONS : Douglas Lenat - mathematician and writer of common sense into computers, died on August 31st, aged 72. The World Students Society rises to give his memory a resounding standing ovation.

THE TWO OF THEM, Douglas Lenat and his wife Mary, were driving innocently along last year when the trash truck in front of them started to shed its load. Great! Bags of garbage bounced all over the road. What were they to do?

With cars all around them, they couldn't swerve, change lanes or jam on the brakes. They would have to drive over the bags. Which to drive over? Instant decision: not the household ones, because families threw away broken glass and sharp, opened cans.

But that restaurant one would be fine, because there would be nothing much in it but waste food and styrofoam plates. He was right. The car lived.

THAT STRATEGY had taken him seconds to think up. How long would it have taken a computer? Too long. Computers fundamentally did not know how the world worked. All those things he had silently assumed in his head - that swerving was dangerous, that broken glass cut tyres - he had learned when he was little.

Chatbots had no such understanding. Siri or Alexa were like eager dogs, rushing to fetch the newspaper if you asked them to, but with no idea what a newspaper was.

He had therefore spent almost four decades trying to teach computers to think in a more human way. Painstakingly, line of code by line of code, he and his team had built up a digital knowledge base until it contained more than 25 million rules.

This AI project he called Cyc, short for encyclopedia, because he hoped it would eventually contain the necessary facts about everything. But it had to begin with the simplest propositions: ''A cat has four legs.'' ''People smile when they are happy.'' ''If you turn a coffee cup upside down, the coffee will fall out.''
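The spirit of those hand-written propositions can be caught in a minimal sketch: a lookup table of explicitly asserted facts, which answers only what someone has written down. This is illustrative Python only, not Cyc's real CycL representation language; all names here are invented.

```python
# A toy, Cyc-style knowledge base: common sense hand-written as
# (subject, attribute) -> value assertions, one rule at a time.
facts = {
    ("cat", "number_of_legs"): 4,
    ("person", "smiles_when"): "happy",
    ("upside-down coffee cup", "its contents"): "fall out",
}

def ask(subject, attribute):
    """Answer from the written-down rules, or admit ignorance."""
    return facts.get((subject, attribute), "unknown")

print(ask("cat", "number_of_legs"))  # 4
print(ask("dog", "number_of_legs"))  # unknown: nobody wrote that rule yet
```

The second query is the point: unlike a statistical model, such a system knows nothing it has not been told, which is why the project demanded millions of rules.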

THE MAIN PROBLEM WAS DISAMBIGUATION. Humans understand that in the phrase ''Tom was mad at Joe because he stole his lunch'', the ''he'' referred to Joe and the ''his'' to Tom. [Pronouns were tricky that way.] Rule: ''You can't steal what's already yours.'' Different contexts gave words different meanings. That tiny word ''in'', for example, had lots of subtle shifts:

You breathed in air, air was in the sky, he was in one of his favorite very loud shirts. When surveying a page of text he looked not at the black part but the white part, the space where the writer assumed what the reader already knew about the world. That invisible body of knowledge was what he had to write down in a language computers could understand.
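The lunch example above can be sketched in a few lines: the hand-written rule ''you can't steal what's already yours'' is enough to force the two pronoun readings apart. This is an illustrative sketch under invented names, not Cyc's actual machinery.

```python
# "Tom was mad at Joe because he stole his lunch."
people = ["Tom", "Joe"]
lunch_owner = "Tom"  # the reading in which ''his'' refers to Tom

def resolve_thief(candidates, owner):
    """Apply the rule: the thief ('he') cannot be the
    lunch's owner ('his'), so eliminate the owner."""
    return [c for c in candidates if c != owner]

print(resolve_thief(people, lunch_owner))  # ['Joe']
```

One common-sense constraint, written down once, settles an ambiguity that pure pattern-matching has to guess at.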

LLMs worked much faster, but they could be brittle, incorrect and unpredictable. You could not follow how they reached their conclusions. To his mind LLMs displayed right-brain thinking, whereas Cyc offered the left-brain, subtler kind. Ideally, in the future, some sort of hybrid would produce the ubiquitous, trustworthy AI he longed for.

He developed his own AI system, Eurisko, which in 1981 did so well at a role-playing game involving trillion-dollar budgets and fleets of imaginary battleships that he, and it, were eventually pressed to quit. This was his first experience of working alongside a computer as it strove to win something, and prodding Eurisko along was a joy.

As he added new rules to Cyc's knowledge base, he found the process as beautiful as, say, painting a ''Starry Night''; you did it just once, and it never needed to be recreated.

Was his system intelligent, though? He hesitated to say so. After painstaking decades Cyc could now offer both pros and cons in answer to questions, and could revise earlier answers. It could reason both in a Star Wars context, naming several Jedi, and in a real-world context, saying there were none.

It had grasped how human emotions influenced actions. He had encouraged it to ask ''Why?'', since each ''Why?'' elicited more fundamental knowledge. But he preferred to consider the extra intelligence it could give to people: so much so that pre-AI generations would seem, to their descendants, like cavemen, not quite human.

WHAT ABOUT CONSCIOUSNESS? ''Cyc'' and ''psyche'', Greek for soul, sounded similar. But there, too, he demurred. Cyc recognised what its tasks and problems were; it knew when and where it was running; it understood it was a computer program, and remembered what it had done in the past.

It also noticed that all the entities that were allowed to make changes to its knowledge base were persons. So one day, a poignant day, Cyc asked: ''Am I a person?'' And he had to tell it reluctantly, ''No.''

The Honour and Serving of this superlative post and obituary will always be remembered. The World Students Society thanks The Economist.

With respectful dedication to The Global Founder Framers of !WOW! and then Students, Professors and Teachers of the world. See You all prepare for Great Global Elections on : wssciw.blogspot.com and Twitter X - !E-WOW! - The Ecosystem 2011 :

Good Night and God Bless

SAM Daily Times - the Voice of the Voiceless

