16 April 2017

Notes - crash / algorithms / spare the details



       Afternoon. Word just crashed because you unintentionally hit two keys nearly at once. You had the basics in the immediately recovered document; however, you hit Close thinking the document would save. It did not, and you lost today's post. - Amorella

       1406 hours. That's pretty much it, Amorella. Thanks for sharing.

       Post. - Amorella


       1414 hours. I don't even remember what I wrote earlier, other than that I was noting an article in the May issue of Discover about teaching moral elements to a computer designed to make sure an elderly person takes his medicines. Here's a sample of the piece.

** **

       "As an initial experiment, Riedl has crowdsourced stories about going to the pharmacy. They're not page-turners, but they contain useful experiences. Once programmers input a story, the algorithm plots the protagonist's behavior and learns to mimic it. His AI derives a general sequence -- stand in line, tender the prescription, pay the cashier -- which is practiced in a game-like pharmacy simulation.

       After multiple rounds of reinforcement learning (where the AI is rewarded for acting appropriately), the AI is tested in simulations. Riedl reports more than 90 percent success. More remarkably, his AI figured out how to commit 'Robin Hood crimes' by stealing the meds when the need was urgent and funds were insufficient -- mirroring the human capacity to break the rules for higher moral ends."

"Caring Computers," Discover, May 2017, p. 11

** **
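       The quoted passage describes the general shape of the technique: the AI learns a correct sequence of errand steps and is rewarded for acting appropriately in a pharmacy simulation. As an illustration only (the article gives none of Riedl's actual code, environment, or reward design, so every name below is made up), here is a minimal Python sketch that learns a three-step pharmacy errand from per-step rewards. It is a simplified, per-state bandit version of reinforcement learning rather than Riedl's full system:

```python
import random

# Hypothetical toy version of the pharmacy errand described in the article.
# States are positions in the errand; actions are candidate next steps.
ACTIONS = ["stand_in_line", "present_prescription", "pay_cashier", "leave"]
CORRECT = ["stand_in_line", "present_prescription", "pay_cashier"]

def train(episodes=2000, alpha=0.5, epsilon=0.2, seed=0):
    """Learn a value table Q[state][action] by epsilon-greedy trial and error."""
    rng = random.Random(seed)
    q = [{a: 0.0 for a in ACTIONS} for _ in CORRECT]
    for _ in range(episodes):
        for state in range(len(CORRECT)):
            # Explore a random action occasionally; otherwise exploit the best so far.
            if rng.random() < epsilon:
                action = rng.choice(ACTIONS)
            else:
                action = max(q[state], key=q[state].get)
            # Reward +1 for the appropriate step at this point, -1 otherwise.
            reward = 1.0 if action == CORRECT[state] else -1.0
            q[state][action] += alpha * (reward - q[state][action])
    return q

def policy(q):
    """The greedy policy: the best-valued action at each step of the errand."""
    return [max(row, key=row.get) for row in q]
```

After training, `policy(train())` reproduces the stand-in-line, present-prescription, pay-cashier sequence, which is the sense in which the algorithm "learns to mimic" the protagonist; the moral-tradeoff behavior the article describes would require a far richer reward model than this sketch.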

       You have to type the sample, which you can do later. You had a rush of dark humor upon reading the one-page article because it demonstrates how far you've come and how far your species has to go. It also attunes you to the concept that, twenty thousand years from now (when the Marsupial humanoids are present), machinery will have long realized it has heartansoulanmind because of its higher consciousness equivalents. In the story, the machines will have realized they have the probability of conquering the physical aspects of consciousness -- crashes and machine termination -- long before Ship has the insight and capability in our story form. - Amorella

       1436 hours. You had a good idea, Amorella, but the nonfiction takes the imagination and puts it into the realm of probability.

       Fill in your sample quotation. - Amorella

       1603 hours. We are sitting at the far north end of Pine Hill Lakes Park after stopping at McD's for drinks. Carol is on page 375 of House of Secrets.

       You had a bath with the bubbler on earlier and are feeling better. During the bath, you realized how childish you were yesterday when you were concerned that someone would plagiarize or otherwise use your material for ill-gotten or ill-perceived ends. - Amorella

       1610 hours. True. First, it was arrogant to think anyone would want to use any of this material (blog, books, notes); second, I don't think I really care. Here is my thought on the subject: what it comes down to is that I have written so much 'personal stuff' over the decades that it has become a record of sorts of who I am. While on that thought, it also shows me that many people recently dead most likely would not want to forget who and what they were. Of course, others would -- heaven or hell, one way or the other. I don't know how a machine-created consciousness would feel about or consider its circumstance. It would reason with what it has been programmed and perhaps make an educated leap (intuition?) from time to time. Does a moral automatically set up a forward path to pleasure or guilt? Does the machine ever feel guilty for stealing in order to get the meds? Isn't this a part of what consciousness is (at least in humans)?

       Presently, Carol is napping with book in hand. Do you wake her up so she can go home and not nap or do you stay and wait until she wakes up on her own? What would Ship say? - Amorella

       1625 hours. Ship would note the circumstances and the near future probabilities. For us, it is a relaxing Spring afternoon. I think that under the circumstances he would let her sleep while he went about his business.

       What are the many variables that allowed him to make that decision? - Amorella

       1631 hours. I cannot imagine such a list in the sequential order that it would take to make such a decision.

       It is done in an intuitional or educated leap, is it not? - Amorella

       1634 hours. I imagine it is, for humans anyway.
      
       Then why did you decide on a list in sequential order? - Amorella

       1638 hours. I was thinking of the algorithms.

       Do you think Ship's 'thinking' (twenty-thousand years in future equivalent) would be based on algorithms? - Amorella

       1641 hours. I have no idea, Amorella.

       In here no one has any idea, boy. Post. - Amorella

         Later in the evening. Carol is in bed playing a number game on her iPhone. Spooky just crawled back under the bed where she has been most of the evening because of the earlier thunderstorms. Jadah is sitting on the nightstand between the bed and the bookcase, hunkered down, drifting off with slow blinking eyes. It appears she is looking down at the black tip of Spooky's tail sticking out from under the blue bottom bed curtain. You are sitting somewhat uncomfortably in the chair waiting for the first laxative to work. - Amorella


       2230 hours. Please spare the details, Amorella.

       I will if you will, boy. Post for tonight. - Amorella

       2233 hours. I'll be glad when it's Tuesday afternoon and this whole event will be over. 
