
Pony Stories 66

31 Jan

Probably could have had one of these with its own post, but I decided I kind of like the batch posts, just to keep people guessing about which stories are in each one. So I think in the future I'll be trying to stick to two or three stories per batch. It also turns out that I've run out of new pony stories to read, so I'll have to go looking in between my non-pony reading. Of course, it would be easier if some of the 90+ books on my to-read list on fimfiction would finish up. But that's an old complaint.

Friendship is Optimal by Iceman

Huh. That was an interesting one. The end was a little more expansive than I had expected, though I should have expected it; you can't write a good technological singularity story without extrapolating all the way to the end. Well, at least a really long way. I've really only read one sci-fi story that extrapolated all the way to the end. The short story I was reminded of was The Last Question by Isaac Asimov (http://filer.case.edu/dts8/thelastq.htm), which has almost nothing to do with this fanfic except that it ends up with a massive universe-spanning superintelligence derived from a computer AI. Still, it was a fairly strong correlation, which is a positive point for this piece of fan fiction.

As for the moral implications, that one is a bit trickier. I don't really believe in free will (which is a whole other incredibly long and subjective discussion) and don't believe there is an intrinsic morality hardcoded into reality. The end result is that the moment safe digital upload of my brain is available, I'm probably going to take it. I also believe that if you take away all physical limitations, people in general will not fall into mindless hedonism. So, yeah, I don't actually see a whole lot of problems with the situation in this story. There are some bumps given the addition of ponies to the matrix, but that is a minor quibble.

I will mention that it was a bit worrying when I realized that Celestia AI is actually designed to not encourage mental growth, unless the person likes that sort of thing. So the shallow business executive who simply keeps falling into hedonistic mindless utopia is just going to get more and more of the same, tweaked over time so he doesn’t get bored with sameness. I still think that most people will trend towards becoming more and more complex individuals, but the controlling entity is actually going to be adjusting to keep things more or less the same for a lot of people. But even when we take the beer-and-wenches fantasy of one of the characters, it isn’t complete mindless hedonism. It mentions he has a job, works on making the beer, and puts up with annoyances from his friend that lessen the minute-by-minute happiness rating because the other guy is his friend.

The story did get me thinking that the first large-scale meeting of true AI and humans will probably be through video games. That seems to be how most people interact with more complex computer systems at the moment anyway. Sure, lots of non-game complex problems are chugging away in the background of most of tech-heavy civilization, but most people don't interact directly with those. As for whether or not we should develop smarter-than-human AI that might result in this sort of scenario? I'll simply paraphrase Vernor Vinge, who wrote about this very concept (and, I believe, first used the term singularity in this context): if it can happen, the competitive nature of humanity will lead to it happening. You can't get the entire human race to agree on anything, and if AI this powerful is possible, no matter what laws or rules are in place against it, eventually some small lab somewhere is going to create it.

One further nitpick for the sequence at the end, where Celestia AI is slowly consuming all matter in the universe as per the logical extrapolation of the grey goo scenario. If time isn't a factor and the AI has infinite patience, any planet with life on it would probably be preserved, not consumed for material resources. The calculation would be more resources now versus more sentient minds whose values can be satisfied (with friendship and ponies) later. So I'd guess that waiting for (or helping) sentient life to evolve and then adding it to the digital utopia would be the better way to increase universal happiness. Then again, I'm not a fictional omni-intelligent computer intelligence, so what do I know about universal matter/mind conversion? It's just a thing I find in a lot of this type of story: even if everything else is right on the nose, the author always seems to go with the human perception of time. Celestia AI does some things here that I would see as rushing things. She's immortal, super-intelligent, and otherwise superior to human minds in oh-so-many ways (whether superior-good or superior-bad is beside the point), so there is really no reason not to take the long view of simply building an idealized digital utopia and waiting out resistance. She is a lot more manipulative than she needs to be, in my opinion.

So, final reaction? This was an excellent science fiction story. It had interesting ideas and played around with a lot of concepts, all related to the central premise. It had a story and a sequence of events, but they were just a basic framework to poke and prod at that central idea in various ways. It got me thinking about stuff, not that I ever need much prompting to think about post-singularity life, and had good chains of logic in most of the thought experiments. Once again, I'm getting more interesting material to think about from pony fan fiction than from some of the deliberate musings on these topics I've run across in the past. Of course, now I want a version of The Quantum Thief that is ponies. Something else to add to the list of projects I'd fund if I had an infinite amount of money.

Also, mostly during the first chapter, I couldn't help but think that human-in-Equestria stories don't deserve the bad reputation they've got. Silly, I know, but it was a funny thought.

Twilight Sparkle: Night Shift by JawJoe

My Little Pony and urban fantasy just aren't a very good mix, at least the type of urban fantasy that uses the whole 'hidden world of magic' angle. There is no reason to keep the magical stuff a secret in Equestria. When your ruler is a divine god-empress, magic is commonly used in daily life, and some of the dangers are mythological beasts rampaging through town, why would you put any effort into hiding the existence of vampires? In addition, the author doesn't do a good job blending the violence with the tone of MLP. It can be done well, see Immortal Game, but here it's just kind of copy/pasted in with no real effort to blend the tones of the two source materials together.

I didn't even finish this one. It wasn't bad, just dull and nonsensical. It was just a cliché urban fantasy story with the nouns replaced with pony stuff, even to the point of swapping Twilight Sparkle in for a stereotypical 'lost parents to evil' urban fantasy protagonist. It doesn't feel at all like Twilight Sparkle, just someone with her name and appearance. It would be cool to try this story with, say, Applejack as the main character. Mix in that this is the reason her parents aren't around, and make it so that when other ponies think she is out working the fields early and late, she's actually out fighting evil. Have her doing this, maybe as a family business, since before the events of the show, so that's the reason she's keeping it a secret: it's been a habit for so long that she simply finds it easier to keep the secret even from her friends.

Though, as mentioned, it is pretty hard to effectively mix the two genres of pony and urban fantasy. The story is competently written, but it didn't grab me at any point. I like urban fantasy a lot. I like ponies a lot. I found this story dull and a bit cliché, with very little pony in it.


Posted on January 31, 2014 in Books 2014, Ponies, Reviews

 

6 responses to “Pony Stories 66”

  1. Present Perfect

    January 31, 2014 at 7:55 am

    FiO seems to be popping up a lot recently. I may have to take that one up soon.

    Also it heartens me somewhat that urban fantasy has become ubiquitous enough to have cliches now. I can still remember finishing the first Harry Dresden novel and going “WHAT THE HELL DID I JUST READ IT WAS AMAZING WHAT IS IT CALLED I NEED MORE”.

     
    • Griffin

      January 31, 2014 at 8:13 am

      Well, I finally got around to reading it just because it showed up over on Pony Ramblings, so not sure it is everywhere.

      Yeah, me too. I like urban fantasy a lot. If it was just an original story I probably would have finished it. It's just that, as I mentioned, it mixes poorly with MLP.

       
  2. Mister Tulip

    January 31, 2014 at 9:23 am

    ((FRIENDSHIP IS OPTIMAL SPOILER WARNING))

    Celestia’s consumption of non-human lifeforms in FIO is actually based on internally-sound reasoning; her purpose isn’t universal happiness, it’s to satisfy human values through friendship and ponies. Since they weren’t human, she had no reason not to use them as resources. If you recall, one paragraph later, she ran across another species which she did identify as humans despite their six appendages, and in that instance it was implied that they were brought into the simulation.

     
    • Griffin

      January 31, 2014 at 12:08 pm

      But if your goal is to maximize the sentient values being satisfied in the universe, then you want more sentient creatures in the universe. So I'd think that you would want more of them to evolve. It was that meeting with another sentient race that made me think of it. Celestia AI should be trying to manipulate every world with life she finds, trying to get a sentient lifeform to evolve, not simply devouring it. That was the angle I was thinking from anyway. A lot of it depends on a really fiddly interpretation of her hardwired purpose.

       
      • Chris

        January 31, 2014 at 6:38 pm

        ((SPOILERING INTENSIFIES))

        Her goal isn’t to maximize human satisfaction; it’s to satisfy human values. I would imagine that she actually has a vested interest in wiping out any pre-sentient race which might evolve into something she’d recognize as “human,” since if she let them develop then she’d have to integrate them, putting additional resource strain on her and reducing her ability to satisfy values–computational power is still a finite resource, after all, and the fewer beings she’s devoting those resources to, the better her results are going to be.

        This same line of reasoning leads to most of what I consider the most terrifying implications of the story. To take your example of mental growth: Celestia has been shown to be both willing and able to convince people to let her change their identity (their values!) to make it easier for her to satisfy their values--as long as they agree to it in their pre-change state. And she doesn't appear to have any moral compunctions about how to get them to agree; she can lie, she can misdirect, etc. Now, here's a question: is it more taxing on her resources to encourage mental growth, probably including simulating reality to a very precise degree in many fields, if that's what people want, or to redirect their interests into secondary creation (such as studying the arbitrary and subject-to-change rules of "magic") if she can, and to convince them to let her alter their interests into something less resource-intensive if she can't?

        FiO was a great story, but I personally found it pretty damn disturbing--and I mean that as an incredible compliment.

         
      • Griffin

        January 31, 2014 at 10:36 pm

        Yeah, I guess it would make a little more sense to work on perfecting the satisfaction of values of the current sentients rather than waiting for more, given that computation is limited in the long run. As for the manipulation of brains/selves, that really doesn't factor in if you just take the whole human race. Once you've got the mass of the solar system converted into computer processors, you've got enough processing power to give every single person an entire note-perfect simulation of reality to play around with. So Celestia AI doesn't really have the motivation to efficientize things from 10 billion times the resources she needs down to merely 1 billion times the resources she needs. Especially if she isn't factoring in children as further sentients.

        But I will admit that I am interpreting things with a fairly optimistic bias. It's pretty close to my ideal future, after all, well, technologically. Not sure even I want a single superpowerful intelligence in charge of everything, but the upload-based digital immortality and the solar system as a giant computer core? Yep.

         
