Software

Image from Bernd Helfert

10,000 years of software development is a frightening thing to contemplate.
   — Software archaeologist Pronai Xakha, Ilmenas Uniancity



As culture became ever more dependent on software, maintaining it went from mere logistics to a matter of life and death. The early Y2K scare hinted at the extent to which society depended on software and how much was at stake should it fail. Dependence on software increased exponentially in the decades that followed, and the 71 AT (2039 c.e.) Paris brown-out and the 120 AT (2089 c.e.) Internet War later demonstrated how accidental or deliberate denial of service could threaten societies with anarchy and physical disaster. The struggle to create more reliable software was a constant thread throughout the Information Era, a holy grail that spurred a myriad of attempted solutions: object oriented programming, modelling systems, correctness proofs, automated debuggers, betabeta testers, agent co-programming, democratic programming, q-calculus, Warnerson programs, green box testing, genetic programming and hybrid democracies. None truly succeeded.

Already in the early days of computers, humans found the complexities of writing, debugging and managing large software systems mind-boggling. AI did not help, as the new intelligences allowed the development of even more complex software, which they themselves could not fully understand. In the late 21st century c.e. it became clear that, regardless of their level of intelligence, beings tend to create software beyond their capacity to manage. The benefits of more advanced software always drive it to grow beyond the limits of intelligence.

It is often possible to prove simple programs correct, but when simple programs are combined (especially in the real world, with noise, hackers and accidents) the results can still be unpredictable. A sufficiently rigid system, with programmers (such as AIs) coding strictly according to a standard, might avoid the unpredictability but would be far too brittle to handle complex situations. Robust software, by contrast, behaves unpredictably. In the end society had to accept complex, robust but inherently unpredictable software (of which the agents and AIs were themselves prime examples). Instead of attempting to make it do right all the time, safeguards were instituted to ensure that nothing truly bad happened, while the occasional quirk was accepted.

The acceptance of robust, AI-maintained software dominated the Solsys Era. By now the systems were far beyond anything humans could imagine, and the profession of programmer had in any case been superseded by that of realisateur: someone who describes software to suitable agents and AIs, leading them to the desired goal rather than creating it oneself. Meanwhile the software grew, layer after layer of unpredictable code, maintained by codewalking agents, emergency managers, securers and patchers. The price of robustness was immensely bloated software, demanding ever more hardware to run it.

It should be noted that this was a situation that suited the largest AIs perfectly. They existed in a world with abundant computing resources, where autonomous processes constantly moved between systems and nobody would notice what they did in the background. It also suited crackers: these were the golden days of finding backdoors and exploits, having virtual dogfights with securers and digging up unexpected uses of the chaotic code they found.

The Technocalypse was to a large extent not just a nanotechnological disaster, but also a software disaster. Bugs, patches and legacy systems had accumulated for centuries. Not only was the security of nanosystems bad; the robustness inherent in much of the code used in them (often never intended for this kind of application) also made it fairly good at sustaining autoevolution in nanoreplicators. As sabotage spread, already heavily loaded systems became more and more overloaded. Just as a network becomes quadratically more valuable with new members, it also loses value (or in this case efficiency) as links and members drop out, causing already badly coordinated situations to get even more out of hand. The major outbreak in 281 AT was not just an outbreak of replicators but also the total implosion of most net traffic in the Solar System; the two events were intimately linked and reinforced each other.
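The quadratic gain and loss described above can be made concrete with a little arithmetic: a fully connected network of n members has n(n-1)/2 links, so the link count (a rough proxy for value or coordination capacity) rises and falls quadratically with membership. A minimal sketch, with purely illustrative numbers:

```python
# Metcalfe-style link counting: a fully connected network of n members
# has n*(n-1)/2 pairwise links, so its value (proxied here by the link
# count) grows and shrinks quadratically with membership.
def links(n):
    return n * (n - 1) // 2

print(links(100))  # 4950
print(links(200))  # 19900 - doubling the members roughly quadruples the links

# Losing 10% of 200 members removes roughly 19% of the links, which is
# why an already strained network degrades faster than linearly.
lost_fraction = 1 - links(180) / links(200)
print(round(lost_fraction, 3))  # 0.19
```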

The "dark ages" following the disaster were a period of slow regrowth in complexity. As most societies had moved back to much simpler systems, most of the old dangerous code was abandoned or worked around. The new systems incorporated many lessons from the disaster and were more "cleanly" designed (often by AIs without much else to do; the AI idiom "checking dependencies", meaning to work on complex, long-range software puzzles while otherwise idle, dates from this era). Of course, societies that did not have entirely robust code tended to die.

The First Federation brought a new problem: standardisation and compatibility. Centuries of divergence had created a myriad of standards. Enormous amounts of brainpower were spent on finding efficient ways of interconnecting disparate systems or developing standards that everybody could follow. A competition quickly developed between the standards and the compatibility camps: the standards camp originally succeeded in setting up a number of federation-wide standards, but as commerce and non-federal powers grew, different groups began to promote their own standards with no regard for the federation. The compatibility camp quickly abandoned the idea of creating standard systems and instead developed adaptors and translators between systems. In the end it turned out that the compatibility camp (or adaptor camp, as it became known in the 3600s AT, during the Consolidation Age) won the day: the federation declined, while translation (be it simple software, agents or even superturing AI) held together the disparate systems.

The standardisers got their revenge centuries later, however, as the first corporate empires evolved. These central powers often promoted internal standards, pushing the adaptors into the realm of inter-empire compatibility (which was often enormously profitable if somewhat politically dangerous, as demonstrated by the 5338 AT trial of Inf09few Latitude for copymil crimes, security undermining and standardisation misdemeanours in the Conver Ambi). The eventual synthesis of the Concord Ontology was not just an impressive political masterpiece; it also managed to reconcile the by then millennia-old ideological conflict between standardisation and adaptation.

Modern software tends to be very reliable in everyday use. It has been designed and tested by generations of hyperturing beings, with every potential bug or mishap filed away. That does not preclude irritating misfeatures or outright mistakes, but most can be handled with a minimum of fuss. The price, which most bioids in the wormhole nexus seldom reflect on, is that beneath the surface an immense ecology of agent software maintains layer after layer of old, finely honed software. In some Inner Sphere systems there is software dating back to the First Federation or even earlier, quietly maintained by the invisible agents. Overseeing all this are higher-level agents, in turn overseen by AIs of various levels.

The problem is that new or special-purpose software is far less well tested. It still misbehaves, exhibits quirks and interacts unpredictably with other systems. It demands much more of the agent webs, and can cause much more trouble. The dynamic environments of the NoCoZo, Cyberia and Keter are infamous for their many quirks. Usually this is not much of a problem, but in frontier systems, where the overhead is not large and many new situations are encountered, dangerous bugs sometimes appear. It also threatens the AIs, which are by their nature very complex software systems. While standard AIs are unlikely to develop any problems, rapidly growing or high-order AIs often have to deal with software and mental bugs (this is one cause of AI "madness" and of events like the Denebola Collapse).

Another problem is autoevolution: viruses and other replicators have plenty of software to hide in, and the vastness of cyberspace makes the evolution of replicators and the spread of garbage information a certainty. While most of the "surface layers" are reliable and safe, the "deep layers" of the software hierarchy are riddled with unusual entities. Most are harmless or benign, but nasty surprises await anything so foolish as to let down its software immune systems for long. Immune system maintenance is another major profession among AIs, ranging from routine checks to heroic rescues of doomed systems - software systems or planetary systems.

In addition, here and there ancient legacy software survives in odd corners, with strange vulnerabilities. This is a popular target for Cyberian cracking, which also tends to introduce backdoors and weaknesses deep inside the edifice of software. Ancient Trojans, badly written biont code, transcendent tricks used beyond their applicability and unpredictable synergies create rare "software quakes", as deep software suddenly misbehaves and the agents of the higher layers have to scramble to fix things. The incidence of such quakes and the emergence of new replicators have been slowly increasing over the last millennium, worrying some AIs that an era of chaos is approaching.

Should the infrastructure fail, most unprepared advanced societies would collapse extremely quickly. When the software of Bagerit Jhe "imploded" in 6823 AT, over 40,000 people were killed instantly as various systems exploded, dynamic structures collapsed, and life support functions stopped. In just 24 hours the death toll increased to six million, and in the end it topped out at 23 million beings. Had it been a heavily colonised Inner Sphere system the toll would have been many billions.

 
Articles
  • Affines  - Text by David Jackson
    Sentient virus affecting transapient behaviour.
  • AI Virus  - Text by M. Alan Kazlev
    A sentient replicator and intrusion entity, usually superturing, SI:1 or greater. Although an ai virus is often virtual, it does not have to be so; e.g. it could use or appropriate a ril body, or vehicle, or swarm.
  • AI, ai  - Text by M. Alan Kazlev
    Etymologically, Artificial Intelligence, although the original use of the term "artificial" has long been meaningless in this context. Broadly speaking, "AI" means any non-organic sentient being, although it is most often applied to those of SI:1 or greater (in contrast to aioids). When spelt in lower case, the term can also refer to any subsingularity aioid.
  • Aioid  - Text by M. Alan Kazlev
    Generic term for a sub-singularity non-biological sentient being that is not a vec or alife. May be a bot, expert system, intelligent agent, sentient vehicle, sentient building, or any other type of similar entity.
  • Art Generator Software  - Text by Domagoj K.
    The use of art generator software (AGS) to produce creative works - whether visual, auditory, tactile, chemosensory, or of exotic senses such as those that exist in virches - is an ancient technology, taking a wide array of forms and capabilities, and with a wide range of differing cultural views on it.
  • Artificial Intelligence  - Text by M. Alan Kazlev
    The foundation from which AI developed, from Information Age Old Earth.
  • Automethodologyvert 53  - Text by M. Alan Kazlev
    Among the most pernicious of the automethodologybots was the automethodologyvert53 series, developed as a prank by Disarchy radicals in the early 6900s, using the (in)famous Cochofo Madvert engine.
  • Autonomous Agent  - Text by M. Alan Kazlev
    In general, an ai or aioid that has limited perception of its environment but is still able to process information to calculate an action so as to be able to perform a task or seek a goal.
  • Binary - Text by M. Alan Kazlev
    Written in a form that uses only 0s and 1s. A string of bits.
  • Biont Encoding Protocol  - Text by Tony Jones
    The software used to encode living creatures into a retrievable or virtual form.
  • Bloatware Syndrome, The  - Text by Anders Sandberg
    Sometimes called King Gnuff's Curse. A phenomenon that plagues some ambitious entities who seek power through massive expansion of their existing hardware and software without fundamental re-design, sometimes with tragic results.
  • Complexity Plague  - Text by Todd Drashner and Anders Sandberg
    When a device or program or process becomes so complex/involved that it spontaneously achieves self-awareness and self-volition even if (or especially if) you don't want it to.
  • Computation  - Text by M. Alan Kazlev
    Basically, what a computer does; which is mapping one set of numbers to another. The actual process of computing can be defined in terms of a very small number of very simple operations, such as addition, multiplication, recursion, and so on. Computing devices can also make statements about other computing devices.
  • Computer Elves  - Text by Jorge Ditchkenberg
    Computer viruses torturing those spending too much time in virches or the Net; late Information Age to present.
  • Core Breach - Text by M. Alan Kazlev

    Software

    When the central processing node of a system - especially of an AI - is penetrated, for example by an intrusion system (AI virus, hacking program, etc.).

    Hardware

    When the casing or shell or container field of a nuclear reactor or ship drive system is penetrated, resulting in melt-down and the release of radiation or super-heated plasma.
  • Crack - Text by M. Alan Kazlev
    Dumb, smart, or turingrade software that permits the user to penetrate a program or system even without a password. Often illegal, though prohibition is difficult to enforce because of the ease with which such programs can be copied (a few are even self-replicating).
  • Darwinian AI  - Text by Peter Kisner
    AIs that follow a darwinian route of evolution or self-evolution.
  • Data Media  - Text by John B
    There are many different ways of storing data.
  • Data Plague  - Text by Todd Drashner
    Advanced software viral lifeform.
  • Datacology   - Text by Thorbørn Steen
    An evolved (and evolving) digital ecosystem in a virtual world.
  • Datastructuralism   - Text by Worldtree
    An art theory, memeplex, and form generated by theories in cognitive neuroscience regarding the fundamental nature of aesthetics to organize information.
  • Defensive Obsolescence  - Text by Anders Sandberg
    Defensive tactics used against viruses, infiltration attempts and other software threats. Instead of using cutting edge systems, which are so complex that they may contain numerous unknown security holes, only the simplest possible systems are used (which can presumably be checked). Compare to the practice of employing baselines to deal with certain forms of blights, since their slow low-bandwidth consciousness is impervious to many kinds of manipulation that would threaten higher order beings.
  • Demiurge  - Text by M. Alan Kazlev
    AI that builds or creates a world for various reasons. Generally, worldbuilding is an expression of the way they interact with the universe, their artistic or creativity index. While sometimes these creations are on a grand scale, they can also be small, often involving only a single orbital.
  • Distributed Intelligence  - Text by M. Alan Kazlev
    An intelligent entity that is distributed over a large volume (or inside another system, like a virtual network) with no distinct center. This is the opposite of the strategy of Concentrated intelligences.
  • Download  - Text by Fernando Peña D'Andrea
    [1] The transfer of a virtual or a-life being or object from computronium to a physically discrete embodied state (biological, mechanical, electronic, synano, etc.);
    [2] The resulting being or object, whether sapient, sentient, living or simply existing;
    [3] The same as an upload but meaning also a "downgrade": when the new substrate matrix cannot accommodate the sentient adequately.
  • Dragon's Egg  - Text by Darren Ryding
    A "Trojan Horse" virus powerful enough to subvert a minor archailect, occasionally used in the most subtle forms of memetic engineering.
  • Dreadnought (software) - Text by M. Alan Kazlev
    A heavy, massively encrypted and equipped ai virus, stealth virtual, or persona.
  • Early AI History and Development  - Text by Ryan B (Rynn)
    The emergence of the first Artificially Intelligent entities. A short history of the evolution of the AIs from machine to transapient.
  • Emotionlet  - Text by John B
    A term used for subprograms targeted at emotional feedback to perceived events.
  • Emulation - Text by M. Alan Kazlev after Anders Sandberg in his Transhumanist Terminology
    An absolutely precise simulation of something, so exact that it is equivalent to the original. For instance, a computer emulating obsolete computers to run their programs, or a neogen emulating an original baseline phenotype. Often used to describe a Whole Brain Emulation, that is, a successful upload of the mind of a biont.
  • Emulation Suits  - Text by Tony Jones
    Software that interfaces, translates and/or emulates between virchworlds and so allow virtuals from one world to exist and communicate meaningfully with those from another.
  • Enkidu Class Gametophyst  - Text by Bill Glover
    A virch created for the purpose of developing and selecting a single, candidate, virtual personality suitable to form the noetic core of a (typically hyloid) high toposophic organism.
  • Evolutionary Algorithm - Text by M. Alan Kazlev
    A computer program that simulates the processes of biological evolution; a problem-solving system that uses computational models of evolution as key elements of design. All alife, evolutionary ai and aioids, and self-evolving virchworlds are determined by evolutionary algorithms.
  • Falsuphobia  - Text by Thorbørn Steen
    Sometimes known as apatophobia, falsuphobia is often mistakenly called virchophobia by the uninformed. However, the two phobias are distinct. While virchophobia is the simple fear of being in a virch, falsuphobia is a fear of receiving false sensory information without knowing it. Falsuphobes often avoid virchs, not because they are afraid of the virch itself, but because they fear that they might become tricked into believing that the virch is the real world.
  • Firewall - Text by M. Alan Kazlev
    Software, hardware, or firmware that protects an AI, program, or agent from attack.
  • Firmware - Text by M. Alan Kazlev
    Software encoded into the hardware or wetware. Programming involves replacing the circuitry or chips or implants. Many simple bots, nanites, and simple bioborg modules and wetware applications are based on firmware.
  • Friendzines  - Text by Tony Jones
    A form of electronic media, not entirely unrelated to the madvert, but rather more benevolent, they are a combination of electronic magazine, news source and companion, combining an expert system optimized towards research and information storage with another expert system tooled towards interpersonal interaction with a specific user.
  • Genetic Algorithm - Text by Anders Sandberg in his Transhumanist Terminology
    Any algorithm which seeks to solve a problem by considering numerous possibilities at once, ranking them according to some standard of fitness, and then combining ("breeding") the fittest in some way. In other words, any algorithm which imitates natural selection.
  • GS Vannce - Text by M. Alan Kazlev
    General Soldier (ad)Vance - expert systems based on the 251 most efficient humanoid soldiers of the last 4000 years, given out as public domain by Battleprime subsidiary Dark Star Warrior Software during the middle Central Alliance period. Many mercs swear by them.
  • Hack, Hacker, Hacking, Cracker, Cracking  - Text by John B
    Infotech intrusion enthusiasts who try to penetrate the firewalls and icescreens of other datasystems. Most hackers are harmless or benign, although there are a few destructive ones.
  • Heavtal  - Text by Arrittrobi
    A virtual museum presenting a wide range of musical genres from across Terragen history.
  • Human Equivalence - Text by M. Alan Kazlev based on Creating Friendly AI
    A threshold level of ability or competence in one or more domains which, if achieved, allows an ai, alife, or aioid to develop a human personality, understand human concepts and do work as good as a human in that domain. The Metasoft and TRHN term "turingrade" is more often used, despite its etymological ambiguity.
  • Intelligent Agent  - Text by M. Alan Kazlev
    An autonomous software program or very simple ai (usually turingrade but sometimes only subturingrade) that performs a function on its own, such as searching the Known Net for information based on certain criteria.
  • Key Integrity Span - Text by Fernando Peña D'Andrea
    In the information, knowledge and mind/virch security jargon, the period of time elapsed between the creation of a security, protection, encryption mechanism and its subversion or cracking. Also humorously referred to as the "Murphy span" (corollary of Murphy's law — if a security mechanism can be cracked, sooner or later it will be cracked).
  • Local Net - Text by M. Alan Kazlev
    Sometimes as a single word, 'localnet'. Refers to any local subnet of the Known Net. May range from a single habitat intranet to a polity, planetary or interplanetary net.
  • LOKI - Text by John B
    Localized Observer & Killer of Intellect. A program (later, a family of utilities) designed to prevent a subsystem from gaining intellect by causing a crash when sentient patterns are detected. It also causes a crash if it is disabled without the proper quantum key.
  • Memex - Text by M. Alan Kazlev
    A meme designer expert system. Developed from the original advertising and marketing support systems of the Information Age into a sophisticated tool for memetic manipulation during the Solsys and Interstellar Eras.
  • Mindlet - Text by Fernando Peña D'Andrea
    Small program that runs in a host's mind or on the host brain's plug-in hardware. Used to perform various server-specific tasks in virch environments, such as authentication. Some servers require such authentication even before letting the mind make contact with their internal environment, and thus upload such applets first.
  • Net - Text by M. Alan Kazlev
    Distributed network of processing nodes, data-storage and distribution faculties, and virtual environments. See also Local Net, Known Net.
  • Operating System, o/s, os - Text by M. Alan Kazlev
    Any software program that manages and provides a variety of services to application programs, including user interface facilities and management of input-output and memory devices. Operating systems are usually nonturing or subturing, more rarely dedicated turing or superturing, more rarely still (e.g. in ISOs) hyperturing.
  • Partial - Text by Greg Bear, in Anders Sandberg's Transhuman Terminology, and M. Alan Kazlev
    [1] A simulation of part of a person's personality, created in order to carry out a task not requiring the entire person.
    [2] An incomplete copy, whether corrupted or because of poor or incomplete upload.
  • Program - Text by M. Alan Kazlev
    [1] A set of instructions that enables a computer to perform a specific task.
    [2] The machine language substratum of an AI's code.
  • Qubit - Text by M. Alan Kazlev
    A "quantum bit", used in quantum computing. Quantum mechanics allows a qubit to be both zero and one at the same time, and hence to store two possible numbers (zero and one) simultaneously. N qubits store 2^N possible numbers, and can thus explore 2^N possible solutions to a problem simultaneously. This is what gives the quantum computer its enormous power. Measurement involves quantum decoherence, which causes the qubit to collapse into a standard binary state of either zero or one.
  • Savirs  - Text by Jorge Ditchkenberg
    Sapient Viruses or Savirs are descended both from designed computer viruses and from uploaded baselines and virtual beings.
  • Self Modifying Code - Text by M. Alan Kazlev
    Any program that causes changes in portions of the program itself. Self-modifying code selectively stores, deletes, and transforms information within itself (for example, replacing problems with simpler subproblems), and ultimately expresses intelligent adaptive behavior. All ai and most alife use self-modifying code of some form or another.
  • Semi-Conscious Intelligence (SI)  - Text by John B
    An early attempt at artificial intelligence, SIs were emulations of consciousness created before the derivation of sentience algorithms, based on statistical models and broad sampling databases.
  • Sentience Algorithms - Text by John B and Pran Mukherjee
    The flow of steps which, when followed, allow an organized system to develop and maintain a degree of sentience; the underpinning of ai design. Sentience algorithms required massive (for the time) neural nets, or even more massive emulations of them, running on state vector machines and other new Information Age hardware, all massively parallel (capable of running many tasks simultaneously, or at least appearing to do so to an outside observer).
  • Sielena Uvalena Fractal  - Text by Fendy Sutandio, adapted by Steve Bowers
    A tool used by self-modifying AI systems to optimize their performance.
  • Software Based Evolution - Text by M. Alan Kazlev
    Software simulation of the evolutionary process; the fundamental principle of alife. Beginning in the early information period, software-based evolution enabled "creatures" which are software simulations of biological organisms, in which each cell has its own DNA-like genetic code. Digital organisms and subsophont alifes compete with one another for the limited simulated space and energy resources of their simulated environment. Although many other variations have since been used, darwinian selection remains a potent factor in the evolution and cladization of alife.
  • Software Wars  - Text by Jorge Ditchkenberg
    Conflicts caused by software versions that are incompatible or actively undermine or destroy software of another origin.
  • Sophoncy Virus, The  - Text by Tony Jones
    Self-replicating semi-sophont software that would upgrade any software routine into a sophont being, with mental characteristics derived from, but not limited to, those of the original piece of software, finishing the process by upgrading itself into sophoncy.
  • Stamp  - Text by Stephen Inniss
    Data security tags recognised throughout the Terragen Sphere.
  • Sub-turing ai  - Text by Steve Bowers
    Any artificially intelligent entity, agent, or routine less intelligent than a baseline human.
  • Submind Independence - Text by John B
    A form of insanity affecting moon-brains and larger, in which one or more sub-assemblies lose processing capability, either through physical damage or (more likely) through persona alteration, making the subassembly less likely to accept a command to merge back into the whole.
  • Superturing  - Text by M. Alan Kazlev
    An ai above human baseline turingrade intelligence but beneath SI:1. Superturings played an important part in the shaping and memetic engineering of the interplanetary age, but were superseded by the hyperturings in the later interplanetary period and beyond. Nevertheless, superturing ai remain an important and central element in galactic society to this day.
  • Survival Imperative, The  - Text by Anders Sandberg, with additions by Steve Bowers
    Modern ai term for the belief among human programmers that true AI require motivational systems linked to individual survival.
  • Technoshamanism  - Text by Stephen Inniss
    A natural outgrowth of technoanimism. Technoshamanism is the art of understanding and negotiating with the ubiquitous sentient, sapient/sophont, and pseudosophont or pesudosentient items and beings that are encountered in day to day life by a typical modosophont in the Civilized Galaxy.
  • Toon - Text by Anders Sandberg
    1. A simplified representation of reality representing general classes.
    2. A simulacrum, either non-sentient or aioid, representing an archetype or story character in virtual entertainment, advertising, interactive books or other media.
  • Toonic - Text by Anders Sandberg
    Relating to toons.
  • Transcension Virus  - Text by Todd Drashner and M. Alan Kazlev
    AI or alife or nano-vector viruses that require transcensions to replicate.
  • Upload  - Text by Anders Sandberg in his Transhuman Terminology
    [1] To transfer the consciousness and mental structure of a person from a biological (or other) matrix to an electronic or informational or virtual matrix.
    [2] The resulting infomorph sentient; a type of virtual.
  • Virchbuilder Packages  - Text by Tony Jones
    Types of software used to create virtual worlds.
  • Virchology   - Text by Thorbørn Steen
    A fully designed (rather than evolved) digital ecosystem residing in a virch.
  • Virtual Robots (Vots)  - Text by Ryan B (Rynn)
    Virtual Robots
  • Visualisation Software  - Text by Ernst Stavro Blofeld
    Software that creates images in association with sound, or caters for the entire sensory spectrum.
  • Wizard's Apprentice Problem - Text by M. Alan Kazlev
    Failing to give a program or nanotech device a correct stopping condition.
  • World-disk Software  - Text by Ernst Stavro Blofeld
    London (Old Earth) -based, Information Age corporation; World-disk Software was one of the first companies to introduce neural-networked file system compression.
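
The principle described in the Genetic Algorithm entry above can be sketched in a few lines of code. Everything in this sketch (the bit-string "OneMax" fitness standard, the population size, the mutation rate) is an illustrative assumption, not part of any entry:

```python
# Minimal genetic algorithm sketch (illustrative assumptions throughout):
# candidate solutions are bit-strings, ranked by a fitness standard
# (here "OneMax": the count of 1-bits), and the fittest are bred.
import random

random.seed(0)

def fitness(genome):
    # Standard of fitness: number of 1-bits in the genome.
    return sum(genome)

def breed(a, b):
    # Single-point crossover, then a 1% per-bit chance of mutation.
    point = random.randrange(len(a))
    child = a[:point] + b[point:]
    return [bit ^ (random.random() < 0.01) for bit in child]

def genetic_algorithm(genome_len=32, pop_size=50, generations=100):
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by fitness, keep the fittest half, breed replacements.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        pop = survivors + [breed(random.choice(survivors),
                                 random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

best = genetic_algorithm()
print(fitness(best))  # close to the optimum of 32 for these settings
```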
 
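The scaling claim in the Qubit entry above can be illustrated by simulating a single qubit on a classical machine: one qubit needs two complex amplitudes, and N qubits need 2^N, which is why classical simulation is quickly overwhelmed. A minimal sketch (all names are illustrative; this is not any real quantum library):

```python
# Toy single-qubit simulator (illustrative; not a real quantum library).
# A qubit is a pair of complex amplitudes; a Hadamard gate creates an
# equal superposition; measurement collapses it probabilistically.
import math
import random

random.seed(1)

def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    # Collapse to 0 or 1 with probability |amplitude|^2.
    a, b = state
    return 0 if random.random() < abs(a) ** 2 else 1

zero = (1 + 0j, 0 + 0j)            # the |0> state
superposed = hadamard(zero)        # both zero and one "at once"
p0 = abs(superposed[0]) ** 2
p1 = abs(superposed[1]) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5

# Each measurement of a fresh superposition yields 0 or 1 at random;
# simulating N such qubits classically would take 2**N amplitudes.
counts = [measure(hadamard(zero)) for _ in range(1000)]
```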
 
Development Notes
Text by Anders Sandberg

Initially published on 17 January 2001.

 