Great Places to Start!
http://www.codecademy.com/
http://code.org/
http://www.tynker.com/hour-of-code/
http://www.codeavengers.com/
http://www.codeavengers.com/javascript/17#1.1
The last link will take you to a game. Enjoy!
Sunday, December 7, 2014
Fun With Coding
What is coding?
Coding is what makes it possible for us to create computer software, apps and websites.
The First Programmable Computer
Colossus was the world's first programmable electronic digital computer. The Colossus computers were developed for British code-breakers during World War II to help in the cryptanalysis of the Lorenz cipher.
How it works
A computer can only understand two distinct states: on and off. In fact, a computer is really just a huge collection of on/off switches called transistors. Anything a computer does is nothing more than a unique combination of some transistors turned on and some turned off. Binary code represents these combinations as 1s and 0s, where each digit stands for one transistor. Binary code is grouped into bytes: groups of 8 digits representing 8 transistors.
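To make this concrete, here's a small Python sketch (Python is one of the languages listed below) that shows how a single character is stored as one byte of binary code. The letter "A" is just an example; any character works the same way.

```python
# Every character has a numeric code, and that number is what the
# computer actually stores, as a pattern of 8 on/off bits (one byte).

letter = "A"
number = ord(letter)          # the character's numeric code
bits = format(number, "08b")  # that number written as 8 binary digits

print(number)  # 65
print(bits)    # 01000001  -> two transistors "on", six "off"
```

Each `1` in the output means a transistor switched on, and each `0` means one switched off.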
Programming Languages
A programming (or coding) language is a set of syntax rules that define how code should be written and formatted. Different languages are designed for different purposes: some are useful for web development, some for writing desktop software, and some for solving scientific and numeric problems. Low-level languages are closer to the binary code a computer understands, while high-level languages bear much less resemblance to binary code. High-level languages are easier to program in because they hide low-level detail and are designed to be easy for us to read and write. Some well-known programming languages are:
- Python
- JavaScript
- BASIC
- C
- C++
- COBOL
- FORTRAN
- Ada
- Pascal
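To get a feel for the difference between "high-level" and "low-level", here's a short sketch in Python (one of the languages above). Python's built-in `dis` module shows the lower-level instructions the interpreter actually runs for a single line of high-level code; the `add` function is just a made-up example.

```python
import dis

def add(a, b):
    # One readable high-level line...
    return a + b

print(add(2, 3))  # 5

# ...which the machine sees as several lower-level instructions.
# dis prints Python's bytecode, a small taste of what "closer to
# the machine" looks like (exact output varies by Python version).
dis.dis(add)
```

Low-level languages make you write something much closer to those individual instructions yourself; high-level languages let you write the single readable line instead.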
http://www.codeconquest.com/what-is-coding/how-does-coding-work/
Technological Singularity
What is technological singularity?
Ray Kurzweil believes that we're approaching a moment when computers will become intelligent, and not just intelligent but more intelligent than humans. When that happens, humanity — our bodies, our minds, our civilization — will be completely and irreversibly transformed. He believes that this moment is not only inevitable but imminent. According to his calculations, the end of human civilization as we know it is about 35 years away. There are a lot of theories about what that would look like. Maybe we'll merge with these machines to become super-intelligent cyborgs, using computers to extend our intellectual abilities the same way that cars and planes extend our physical abilities. Maybe artificial intelligences will help us treat the effects of old age and prolong our life spans indefinitely. Maybe we'll scan our consciousnesses into computers and live inside them as software, forever, virtually. Maybe the computers will turn on humanity and annihilate us. The one thing all these theories have in common is the transformation of our species into something no longer recognizable as such to humanity circa 2011.
Vernor Vinge coined the term "Technological Singularity" (or "the Singularity") in his 1986 novel Marooned in Realtime, and developed the idea further in his 1993 essay The Coming Technological Singularity. He believes that mankind will develop a superhuman intelligence before 2030: "Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended. [...] I think it's fair to call this event a singularity. It is a point where our models must be discarded and a new reality rules. As we move closer and closer to this point, it will loom vaster and vaster over human affairs till the notion becomes a commonplace. Yet when it finally happens it may still be a great surprise and a greater unknown." He believes this may happen in four different ways:
- Scientists could develop advancements in artificial intelligence
- Computer networks might somehow become self-aware
- Computer/human interfaces become so advanced that humans essentially evolve into a new species
- Biological science advancements allow humans to physically engineer human intelligence
Can We Avoid Machines Taking Over?
In 1965 Gordon E. Moore, a semiconductor engineer, proposed what we now call Moore's Law. He noticed that, as time passed, the price of semiconductor components and manufacturing costs fell. Rather than produce integrated circuits with the same amount of power as earlier ones for half the cost, engineers pushed themselves to pack more transistors onto each circuit. The trend became a cycle, which Moore predicted would continue until we hit the physical limits of what can be achieved with integrated circuitry.
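Moore's Law is commonly summarized as the number of transistors on a chip doubling roughly every two years. This short Python sketch shows how fast that doubling adds up; the 2,000-transistor starting chip is a made-up figure just for illustration.

```python
# Project exponential transistor growth: doubling every 2 years.
transistors = 2_000  # hypothetical starting chip

for year in range(0, 21, 2):  # 20 years, in 2-year steps
    print(f"year {year:2d}: {transistors:,} transistors")
    transistors *= 2
```

After just 20 years of doubling, the hypothetical chip holds over a thousand times as many transistors as it started with, which is why even steady doubling feels like an explosion.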
Sources:
http://content.time.com/time/magazine/article/0,9171,2048299,00.html
http://www.singularitysymposium.com/definition-of-singularity.html
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm