Computers are awesome because they can do things at speeds that humans can't match. They can easily process a million bytes (a megabyte) of data in less than a second. But in the hands of the wrong people, they can be used for stupid things like classified NSA surveillance programs (PRISM and more) and hiding the source code of proprietary programs such as Windows and Photoshop.
Microsoft ("Software for Microcomputers") is one of the largest software companies in the world. Bill Gates and Paul Allen started it in the mid-1970's, and at first, its focus was on software-development tools. But in 1980, IBM was developing its Personal Computer, PC for short, and to make development quick, the PC's development team used lots of outside-IBM parts, including its operating system, "PC-DOS", which it licensed from Microsoft. Bill Gates, whose father was a corporate lawyer, kept the rights to sell the same system to other computer makers as "MS-DOS".
When the cloners successfully implemented lawyer-proof imitations of IBM's PC line, Bill Gates was ready with MS-DOS. In the late 1980's, Micro$oft and IBM worked on a successor called OS/2, even as Micro$oft worked on a GUI shell for DOS that it called Windows. OS/2 did not do very well, but by version 3, Windows was a success. Microsoft then introduced versions of Windows that did not need DOS, like Windows NT and Windows 95, eventually settling on NT's descendants: XP and its successors.
In the 1990's, Micro$oft faced litigation over its monopolistic practices. When Windows 3 was run on top of DR-DOS, it would emit misleading error messages. Also, for preloads, Micro$oft was suspected of cliff pricing: charging significantly more per copy if a computer maker preloaded its system on 99% of its machines than if it preloaded it on 100% of them. This is different from economies of scale, where prices decline much more smoothly, and it is good for keeping rivals from getting started. Thus, M$ kept later versions of OS/2 from getting very far, even on IBM's own machines.
It is telling that the only way to compete with Windows is to essentially give away one's software, as the Linux community does.
Microsoft has repeated its monopolistic success with office suites (word processor, spreadsheet, presentation, and related software), but not with much else. Its efforts to compete with the Palm Pilot and the Apple iPod both failed, and it has recently thrown in the towel on smartphones.
Linux is an "operating system" (OS), like Windows and DOS: the fundamental layer of software on a computer that hosts all the other software. Linux runs on everything from smartphones to supercomputers, and though it has not done well on desktop computers, it has done very well on servers. Linux is open source, meaning that its source code is freely available to read, modify, and redistribute.
Strictly speaking, Linux is a kernel, the most fundamental part of an OS. But a kernel by itself is not very usable, and that is why Linux typically comes in "distributions", the Linux kernel with a lot of other software. They typically come with Unix utilities and command-line shells, and the desktop ones also come with GUI shells, making them much like Windows or Mac OS X. The two most common Linux GUI shells are KDE and GNOME. KDE rather closely resembles Windows, while GNOME somewhat resembles OS X. But one can run a command-line app inside of KDE, GNOME, and the like, as one can do with OS X and even Windows.
Be careful of a double meaning here: "robot" can mean a physical machine or a piece of software.
Hardware robots are machines controlled by computers, machines like mechanical arms. They are often used in industrial environments, and they typically do rather simple tasks. Some more recent robots include a variety of sensors to give feedback about themselves and their environments. This is useful for walking, for instance, because a walking robot must constantly sense its balance and correct it. Boston Dynamics has some cute videos of its walking robots, including robots that recover from attempts to push them over.
Software robots or bots are programs left running that do various things. A chatbot, for instance, is a bot that holds conversations with people.
Programming is essentially giving instructions to a computer.
It is possible to program in "machine language", as it is called, the 1's and 0's that a CPU directly reads. But that is very difficult and error-prone, and other sorts of programming languages were soon invented. As a result, programmers almost never write machine language directly anymore.
The first kind to be invented was "assembly language", a symbolic overlay on machine language. Its statements have the format (label) (opcode) (operands). The label is where to transfer control to, for instructions, or else marks out some memory space for data. The opcode or operation code designates the operation. The operands are what the operation works on. They can come in a variety of formats, and there may be several, one, or none of them. For adding A and B to make C:
        LOAD  A
        ADD   B
        STORE C
A       DECL  INT
B       DECL  INT
C       DECL  INT
Assembly language has been widely used, though more recently mostly for using special features of CPU's.
A common add-on to assembly language has been "macros", sets of instructions that can be inserted into the code; ADD(A,B,C), say, could stand for the first three instructions above. If one goes far enough with macros, one gets a "high-level language". The first one was FORTRAN, with an algebra-like appearance:
C = A + B
This sort of appearance has been used ever since in most programming languages, and attempts to make programming languages look like natural languages have generally failed. COBOL, another early high-level language, uses statements like
ADD A TO B GIVING C
But it soon got this sort of statement:
COMPUTE C = A + B
As to being able to program in natural language, that requires strong-AI natural-language understanding, and we are far from that point. Natural languages are grotesquely complicated, with lots of odd rules and exceptions to rules, as anyone who has ever tried to learn a foreign one quickly discovers.