Prosthetics NTA (Neural Tactical Aid)

Discussion in 'Denied' started by Teldrassil, May 26, 2020.

  1. Teldrassil

    Teldrassil Galactic Officer

    Joined:
    Mar 24, 2020
    Messages:
    235
    Likes Received:
    375
    Name: Neural Tactical Aid

    Description: A brain implant that houses a non-sentient A.I. that acts as a tactical assistant in combat.

    Abilities: Keeps track of numerous things in combat and displays them as an overlay on the bearer’s vision. These being: a prediction line stemming from the shooty part of a gun (all guns, including the bearer’s), appearing as a semi-transparent beam of red that points to where the bullet would end up if the gun is fired; a little flash wherever bullets have hit; highlights, in red, of what it believes to be weak points on armour; a running count of how many shots a gun has fired since its last reload, from which it slowly forms an estimate of how many shots the gun fires before it needs reloading and how much ammo it thinks the gun currently has (this appears as a little x/y over the top of the gun); scans of the area for people, highlighting them in red for a couple of seconds; and the bearer’s vital signs.

    Conditional Abilities (Optional): If the bearer has any prosthetics, the NTA will alert the bearer to any damage they sustain.

    Limitations: Slight delay on the overlay. Doesn’t store data for long periods of time, so it has to relearn things like gun magazine sizes. Doesn’t account for projectiles, like grenades, that travel in an arc. Susceptible to EMPs - being hit by one can leave the bearer with impaired vision and send them into shock. Not always accurate - easily deceived.

    Conditional Limitations (Optional): Never turns off, which means it’s very easy to get overloaded with information the bearer doesn’t need or want. For example: walking through a crowd, everyone will be highlighted in red for a bit, and the beam stemming from guns never disappears - even when holstered.

    How does it work: A brain chip housing an A.I. that picks up on brain signals - mainly those from the eyes - scans them for relevant data and then outputs its own signals to the eyes, allowing for the overlay. Runs on brain electricity.

    Flavor text: Originally created by MiniKnog engineers to enhance soldiers and was quickly cannibalised by people in the Fringe.

    Attainability: Open

    Tags: [Military]

    Category: Cybernetics
     
    Last edited: May 26, 2020
  2. 9K

    9K Galactic Officer Staff Member Administrator

    Joined:
    Jun 27, 2017
    Messages:
    324
    Likes Received:
    346
    Hey. I'm going to be starting grading on this today.

    I have a few balance problems with this application.
    The first is just the idea of scanning an area for "people," the concept of which I have an issue with, on top of how vague the application is at the moment. How does it scan for people? What is it scanning to find them? How far can it scan? Can it scan through walls? How about light cover, like a pile of leaves?

    How does this thing distinguish a gun from anything else? How far away can it detect a gun? If a person is aiming a rifle out of a window, otherwise out of sight, would someone be able to see the beam on them because the AI can detect it even if the user can't?

    The reason we have to be very careful about apps like this is because it opens the door to use of kinda meta-knowledge and allows people to potentially get away with asking something such as, "hey what's the weak point on your armor?" and seeming justified in asking this even though it's apparently easily fooled just because they have this implant. Among other examples of use of meta-knowledge.
     
  3. Teldrassil

    Teldrassil Galactic Officer

    Updated with said criticisms in mind, made the A.I. more or less do stuff that a human is capable of doing with enough practice, they just do it for them. Cut the armour part after going through the logic of the tech and realising it didn't quite fit. Reformatted slightly, for ease of reading.

    Name:
    Neural Tactical Aid

    Description: A brain implant in the form of a grey chip that houses a non-sentient A.I (coded with access to a small database of knowledge i.e. "a gun looks like this") that acts as a tactical aid in combat.

    Abilities: Keeps track of numerous things in combat and displays them as an overlay on the bearer’s vision. These being: a prediction line stemming from the shooty part of a gun (all guns, including the bearer’s), appearing as a semi-transparent beam of red that points to where the bullet would end up if the gun is fired; a little flash wherever bullets have hit; a running count of how many shots a gun has fired since its last reload, from which it slowly forms an estimate of how many shots the gun fires before it needs reloading and how much ammo it thinks the gun currently has (this appears as a little x/y over the top of the gun); scans of the area for people, highlighting them in red for a couple of seconds; and the bearer’s vital signs.

    Conditional Abilities (Optional): If the bearer has any prosthetics, the NTA will alert the bearer to any damage they sustain.

    Limitations:
    -Slight delay on the overlay - roughly 0.1 seconds.
    -Doesn’t store data for long periods of time, so it has to relearn things like gun magazine sizes.
    -Doesn’t account for projectiles, like grenades, that travel in an arc, or for guns with spread - like a shotgun.
    -Susceptible to EMPs - being hit by one can leave the bearer with impaired vision and send them into shock.
    -Not always accurate - easily deceived, as it runs on an "in sight, in mind" basis.
    -Features such as the prediction beam and magazine tracking have a maximum of 8 guns tracked - the features share a gun slot; the A.I. prioritises the closest guns.
    -Kind of annoying: say you take a hit to a prosthetic or your vitals aren’t doing so well - it makes your vision flash red and shoves a message on the topic right in the centre of your vision.

    Conditional Limitations (Optional):
    -Never turns off, which means it’s very easy to get overloaded with information the bearer doesn’t need or want. For example: walking through a crowd, everyone will be highlighted in red for a bit, and the beam stemming from guns never disappears - even when holstered.
    -Doesn't track anything other than guns whatsoever.

    How does it work: A brain chip housing an A.I. that picks up on brain signals - mainly those from the eyes - scans them for relevant data and then outputs its own signals to the eyes, allowing for the overlay. Runs on brain electricity.

    Abilities, in detail:
    -Prediction Beam. The A.I. identifies the gun by cross-referencing its appearance with what it already knows, so if it looks like a gun... that's definitely a gun (even water pistols) - it's easily deceived by sticking a cloth over a gun or making the gun look less gun-like. It makes an attempt to pinpoint where the shooty part is, and once it has decided where it thinks that should be, it sends the beam out.

    -Hit Location. When a shot is fired, it tracks the projectile across the user's vision and when it thinks the bullet has hit something, it places the flash - but only does that once per bullet. For example: Bullet is fired, tracking, tracking, tracking, tracking, bullet has hit a shirt hanging from a clothes line, location marked - bullet has stopped being tracked.

    -Magazine Tracking. Pretty simple: keeps a counter for each gun, does a bit of maths and then puts a counter at the end of the gun, above the start of the red beam.

    -Life Sign Tracking. Scans brain signals and gives an output.

    -People Highlighting. Stores data on what a "person" looks like, similar to what it does with guns, and then cross-references that with what the bearer is seeing - if they see such a thing, it gets highlighted in red. What it thinks a person looks like is 2 arms, 2 legs, a torso and a head; anything else is not a person as far as the A.I. is concerned. If a person is too far away for a human to properly make them out (so a sniper would be a blip in the distance), it doesn't track them. If a person hid in a bush or something, the A.I. wouldn't pick up on them - even if the bearer could see the person's face; it's missing 2 arms, 2 legs and a torso. Because it relies entirely on what the bearer can see with their eyes, it scales with their eyes. Take a normal human with normal human vision: the tracking is limited by what that person can see. But say we had a cyborg with thermal vision - it uses that, which would probably let it see that the person in the bush has all the person parts, and it would highlight them. Take another cyborg with zooming eyes: if they zoom in on a position and see what the A.I. deems a person, it highlights them. The better the bearer's vision, the better the feature.
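    The matching and estimation logic described above can be put as a toy sketch - purely illustrative, not anything specified in the application: the names, the all-body-parts matching rule and the "largest magazine seen so far" estimate are my assumptions about how such rules might reduce to code.

```python
# Hypothetical sketch of the NTA's described rules (assumptions, not canon).

# "In sight, in mind" person matching: the A.I. only calls something a
# person if every expected body part is visible to the bearer.
PERSON_PARTS = {"head", "torso", "left arm", "right arm", "left leg", "right leg"}

def highlight(visible_parts):
    """Highlight only when all expected body parts are seen (a bush hiding
    the legs and torso means no highlight, even if the face is visible)."""
    return PERSON_PARTS <= set(visible_parts)

class MagazineTracker:
    """Per-gun shot counter that slowly forms a magazine-size estimate."""
    def __init__(self):
        self.shots = 0       # shots seen since the last reload
        self.estimate = None  # magazine size; unknown until a reload is seen

    def shot_fired(self):
        self.shots += 1

    def reload_seen(self):
        # Assumed rule: the fullest magazine observed so far is the estimate.
        if self.estimate is None or self.shots > self.estimate:
            self.estimate = self.shots
        self.shots = 0

    def readout(self):
        """The little x/y counter shown over the gun ("?/?" before any reload)."""
        if self.estimate is None:
            return "?/?"
        return f"{self.estimate - self.shots}/{self.estimate}"
```

    So a gun the bearer has watched fire three times and reload once would read "3/3" when full and "2/3" after the next shot - and a cardboard cut-out with all six "person parts" would still be highlighted, which is the easy-to-deceive part.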

    Flavor text: Originally created by MiniKnog engineers to enhance soldiers and was quickly cannibalised by people in the Fringe.

    Attainability: Open

    Tags: [Military]

    Category: Cybernetics
     
    Last edited: May 27, 2020
    Moon Moth likes this.
  4. 9K

    9K Galactic Officer Staff Member Administrator

    Sorry it's taken so long to get back to this.

    I'm having trouble accepting this application, honestly. I still have some issues with whether even the concept can be worked into something that's balanced, functional and not completely at the mercy of person-to-person interpretation, because it has all of the means to be very, very useful, but also plenty of ways to maybe fool it that may or may not work - which is one of many things that would be completely up in the air.

    The issue is that there are a lot of "maybes" that either make the app useless and more of a hindrance, or very, very good depending on what the user wants to do with it, without much way to determine what it really should be doing, which lends itself to functioning perfectly when it's convenient and maybe causing some borderline metagame problems.

    Just as an example: if I have a character looking at someone with a gun, I want to keep track of that person's magazine, and I'm using this application, it's kind of up to me whether or not the reading of their ammunition count is accurate. The distance at which it can sense a person is vague right now too, which feeds into the previous issues - only strengthened by the "in sight, in mind" nature of the app, where if you can perceive it, so would the implant.

    The other thing is just that I don't think there's much of a way for a brain implant to detect damage on a prosthetic unless the prosthetic can feel pain anyway, because if it can't, then the only signals going between the two run from the brain to the arm, not from the arm back to the brain for the chip to read. And while damage could easily be detected if pain were built into the prosthetic, with pain, you'd know there's damage anyway.

    The "in sight, in mind" thing also kind of contradicts itself a little bit in the app. Bullets move fast. We can't perceive them moving all the way to their target generally, but the app states this implant is tracking them, and presumably through your own eyes that biologically can't perceive the bullet moving.

    I hate to fail an app, but I don't really think passing this would have good implications for the setting as far as balance or consistency.
     
    Last edited: Jun 15, 2020
    Teldrassil likes this.
  5. Teldrassil

    Teldrassil Galactic Officer

    All fair points. I'm perfectly happy with that grade. Thanks for reviewing the app!
     
  6. Skid

    Skid God Incarnate Staff Member Community Monitor Diamond Donator

    Joined:
    Jun 26, 2017
    Messages:
    493
    Likes Received:
    224
    Seconding the fail.
     
    Teldrassil likes this.