a question got asked in the comments recently, about wiping memory for robots. basically, trying to ‘unsee’ the current scene. i think it’s an ability we’ve all wished we had at one point or another, but is it relatable?
it’s an interesting question, one i think a lot of writers and creative types have to ask themselves: ‘am i making a story that’s sci fi, or am i exploring sci fi with a story?’ it may be a subtle difference, but for me it’s the same distance as between your pc manual and a nursery rhyme. and i’m not slagging hardcore science fiction, but it can be a little dry. then again, go too far the other way, and ‘love’ is the fifth element, or you can will yourself into another dimension outside of time to send notes back to your daughter through a wormhole… and maybe science isn’t THAT forgiving just because you’re the protagonist. i digress.
where am i going with all this? well, if you’ve noticed in my story, some people and robots have advantages over one another. but, to an extent, i like the lines a little blurry. robots aren’t necessarily stronger than humans, nor do they run around saying stuff like ‘what is… love?’ and i think this makes them more relatable.
the way i see it, emotions are a shorthand for logic. try to tell someone exactly why you love your friend, and you may not have a quick and clear answer. maybe they helped you out of a jam or two, or you shared something private. maybe it’s completely ineffable, and they’re all you know, and that bond is mutual. if an AI were limited the way humans are, without complete lightning-quick access to all their memories at any given time, they might develop the same sort of subroutines. ram vs rom. ‘i remember that day in the park, and that was nice’, but instead of feeling the entire thing at once, i’ll just remember you’re important to me. +1 affection. i see this kind of sketchy logic as essential to feelings. and while it might seem a little illogical for machines to adopt the sort of shortcuts we use to make quicker decisions, i think it’s a likely first step to actual artificial intelligence.
and bonus, it’s imperfect. they’d fit right in.
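if you wanted to doodle that ‘ram vs rom’ idea in code, it might look something like this. everything here is invented for illustration, obviously: the class name, the +1 scoring, all of it. the point is just the two paths, one cheap cached feeling vs one expensive full recollection:

```python
# toy sketch: emotions as cached summaries of memories.
# all names and numbers here are made up for illustration.

class Agent:
    def __init__(self):
        self.episodes = []    # the "rom": every stored memory, in full
        self.affection = {}   # the "ram" shortcut: person -> cached score

    def remember(self, person, event, valence):
        """store the full episode, but also fold it into a cheap summary."""
        self.episodes.append((person, event, valence))
        # instead of re-feeling everything later, bump a single number now
        self.affection[person] = self.affection.get(person, 0) + valence

    def feels_about(self, person):
        """the quick, emotion-like path: one lookup, no replay."""
        return self.affection.get(person, 0)

    def why(self, person):
        """the slow, logical path: replay every relevant memory."""
        return [event for p, event, v in self.episodes if p == person]

bot = Agent()
bot.remember("ada", "helped me out of a jam", +1)
bot.remember("ada", "that day in the park", +1)

print(bot.feels_about("ada"))  # 2 -- the fast shorthand
print(bot.why("ada"))          # the full, expensive recollection
```

the sketchy part is exactly the point: the cached score throws away the details, so the agent can act on ‘you’re important to me’ without knowing precisely why.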
and, as i consider the complexities of human thought, and how a robot might process its ‘thoughts’, it’s an odd coincidence that i’m having computer problems right now. you see, i updated a driver on my drawing tablet tonight. that didn’t go well. the installed software went fifty ways, and without realizing it, made itself so integral to my machine that i thought i might have to reinstall the OS. i didn’t, but just getting my machine to see that it was hooked into a tablet was a major effort. and if something that simple turned out to be that difficult, maybe scrubbing a robot of a single memory wouldn’t be any easier than getting someone to break a bad habit. and we’re not that good at that either.
so, to wrap up this thought, i suppose i’ll just say that i use the complexity of science fiction to make robots more relatable. because if anything will make you feel akin to something else, it’s the feeling of being out of control, in over your head, and it’s half your own fault for not being able to process what’s going on.
not recommending this movie. thing is, i was trying to think of an example of when AI seemed very relatable to me, but i couldn’t think of an instance. comment below if you think of one. i spent all of 2 minutes, so i really didn’t put a lot of work into it. sooooo, i decided to go the other way and point out a movie where it totally did not work, and maybe why sci fi geeks freaked out at what hollywood did to the work of isaac asimov.
first off, the robots in this movie are clearly physically superior. that’s just a given, and since they have superhuman strength, strike one. second, only one robot displays the burgeoning sentience they keep prattling on about, and he was programmed that way, to lead unevolved robots like some sort of mechanical jesus. but again, he didn’t evolve; he was gifted his sentience. and what does he do with it? he’s manipulated into a plot to kill his creator in some lazy plan to make a point. like a machine, not a character.
sonny, the main AI character, seems to be written to point out all the ways in which humans and robots are different, undermining the main point. drawing constant differences between characters that are working toward a unifying goal seems counterproductive to me, and if the character has no growth, or realization, or arc, then he’s still an object, and the movie has contradicted its own point. then again, will smith didn’t exactly break out of his comfort zone on this one. still, i guess hollywood gives us what we demand: a movie about robots becoming self-aware, so will smith can mow down hundreds of them for a runtime of 115 minutes and a budget of $150,000,000.