Spotlight

What Regulates And Defines Artificial Intelligence?


For most people, AI (artificial intelligence) is a pipe dream that became a reality in their favorite movies. From Star Trek’s Data and the Terminator’s T-800 to Ex Machina’s Ava, there has never been a lack of imagination when it comes to AI, whether portrayed as humanity’s ally or foe. However, defining AI is not a question a computer scientist can easily answer. To be fair, AI does not necessarily think the way humans do; the human façade most of these machines wear in the movies is simply a way of making them feel more ‘real’ and relatable, not a way of making them realistic.

Artificial Intelligence

For one, an AI can be a computer program designed for a specific mission with the sole task of completing it as well as possible. For example, AlphaGo was designed to beat the human world champion at Go, and Watson was designed to beat humans at Jeopardy!, yet neither of these systems acted or ‘thought’ out its moves the way humans do. Instead, pattern recognition combined with sheer computational search let these AIs anticipate and counter their opponents’ moves in a way humans cannot. Rather than saying AI thinks and executes tasks like humans, it would be more accurate to say it thinks and acts rationally.

These days, scientists often define AI as getting computers to carry out tasks that humans do automatically and subconsciously. Converting common sense into binary code is one of the most difficult snags scientists have hit, despite the exponential growth of computing power. Teaching machines to operate in a three-dimensional world when their entire existence rests on converting sensory input into 0s and 1s is no small task. Newer AIs are programmed to learn through interaction, yet their shortcomings when it comes to making spontaneous choices have not gone unnoticed. The upside is that some AIs already handle certain narrow tasks very well, and their currently limited range of uses is expected to expand soon.
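To make "learning through interaction" a little more concrete, here is a minimal, hypothetical sketch in Python: a tabular Q-learning agent that learns to walk to the end of a tiny corridor purely by trying actions and observing rewards. The corridor world, the reward values, and the hyperparameters are illustrative assumptions, not drawn from any system mentioned above.

```python
# Toy example of an agent that learns through interaction:
# tabular Q-learning on a 5-cell corridor (all values are illustrative).
import random

N_STATES = 5          # positions 0..4; reaching position 4 ends the episode
ACTIONS = [-1, +1]    # step left or step right
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2

# Q-table: the agent's running estimate of future reward for each (state, action)
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Move the agent and return (next_state, reward, done)."""
    nxt = max(0, min(N_STATES - 1, state + action))
    if nxt == N_STATES - 1:
        return nxt, 1.0, True   # reward only for reaching the goal
    return nxt, 0.0, False

for episode in range(500):
    state, done = 0, False
    while not done:
        # Explore occasionally, otherwise exploit what has been learned so far
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt, reward, done = step(state, action)
        # Update the estimate from the observed outcome of this interaction
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])
        state = nxt

# After training, the learned policy steps right (+1) from every position
print([max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)])
```

The agent is never told the rule "go right"; it discovers it by acting, observing, and updating its estimates, which is the sense in which such systems learn through interaction rather than explicit programming.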

Interesting Facts About The Invention Of The Microwave Oven

Almost every home today has a microwave oven. It saves a lot of time and, most importantly, effort. The device can produce an array of dishes at the touch of a few control buttons. It was invented around the time of the Second World War, but many people were apprehensive because of the bulky, expensive design of the early units. Few were inclined to use the device because of the radiation involved. Today, those fears have faded with technological advances, and many households own more than one microwave oven to streamline their lives.

There is an interesting history behind the invention of this ubiquitous gadget. Percy LeBaron Spencer, a skilled engineer, was working on vacuum tubes and magnetrons that produce microwave radiation, intended for use in radar systems. Spencer proved remarkably good at boosting their output. One day in the lab, a piece of chocolate left near an active magnetron began to melt. Curious, Spencer extended his experiments and began testing other foods: popcorn kernels popped, and an egg exploded. Spencer quickly drew the obvious conclusion.

Spencer soon confirmed that low-density microwave energy could indeed cook food. For his next set of experiments, he constructed a small metal box and released microwave radiation into it, finding that food inside cooked even faster. In 1945 he filed a patent for his invention, and over his career he went on to acquire some 150 patents, according to the National Inventors Hall of Fame. In 1947, a restaurant in Boston tested the first commercial microwave oven. Its popularity grew over time, and today many food establishments rely on the indispensable microwave oven for a variety of tasks.
