Friday, November 25, 2016

Implications of an Improved Lip-Reading Software




Researchers at Oxford University have recently developed a new program called LipNet, which analyzes video frames of mouth movements, strings them together into sentences, and outputs the result as text. While this seems like a fairly simple way of doing lip reading, it has actually been amazingly successful. This new system of reading lips has reached an astonishing 93% accuracy,
which trumps the previous best computer rating of 79% and smashes the human expert level of 52%.
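
To make that frames-to-words pipeline a little more concrete, here is a toy Java sketch of the general idea: label each frame with a mouth-shape code and look the resulting sequence up in a dictionary. To be clear, nothing here reflects LipNet's actual model, which is a neural network trained end to end on real video; the labels and the tiny dictionary below are invented purely for illustration.

import java.util.HashMap;
import java.util.Map;

public class ToyLipReader {
    public static void main(String[] args) {
        // Pretend a classifier has already labeled each video frame with a mouth-shape code
        String[] frameLabels = {"H", "H", "EH", "L", "L", "OH"};

        // Collapse repeats, since consecutive frames show the same mouth shape
        StringBuilder sequence = new StringBuilder();
        String previous = "";
        for (String label : frameLabels) {
            if (!label.equals(previous)) {
                sequence.append(label);
                previous = label;
            }
        }

        // Look the collapsed sequence up in a tiny, invented dictionary
        Map<String, String> dictionary = new HashMap<>();
        dictionary.put("HEHLOH", "hello");

        System.out.println(dictionary.getOrDefault(sequence.toString(), "<unknown>"));
    }
}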

The implications of this new software go way beyond just another creative example of a computer taking an input and returning an output. The social opportunities for this type of software are massive. From a human welfare perspective, over 360 million people worldwide struggle with hearing problems, and a program such as this could drastically improve their lives. Additionally, the software could be commercially helpful in anything from reading opposing teams' lips in sports to reading the lips of people speaking far away without a microphone or in a noisy environment. There could also be national defense implications: reading people's lips in video footage without sound, or monitoring people's conversations from a distance with binoculars. Regardless of how it is eventually used, the possibilities are endless.

The biggest concern, however, is privacy. We live in a world where almost everything you do is being recorded, whether by phones, street cameras, or even satellite surveillance. If this lip-reading software falls into the wrong hands, it could enable massive breaches of privacy, with outside parties intruding on private conversations. This will most likely become one of the developing hot topics, like drone use and cybersecurity, that will require new laws and regulations to keep under control.


Resources:
1. http://www.digitaltrends.com/cool-tech/lipreading-artificial-intelligence/
2. http://www.signlanguagenyc.com/wp-content/uploads/2016/03/lip-reading-asl-cart-services-nyc-02-219x300.jpg
3. http://www.notbored.org/watching-all-day.jpg
4. http://wwwhatsnew.com/wp-content/uploads/2016/11/LipNet-730x480.jpg

Saturday, November 12, 2016

MogIA Accurately Predicts Election


In light of current events in the United States, I thought I would take this week to write about what is on everybody's mind: the election. I don't want to get political during what is a very sensitive time, so I am going to try to be as unbiased as possible throughout this post.

Leading up to the election, Hillary Clinton seemed poised to win the presidency according to many polls and news sources. However, to the surprise of many, Donald Trump won several key swing states to become the next President of the United States. Who could have ever predicted this? Apparently, an AI system called MogIA.

Leading up to the election, Business Insider posted an article about MogIA and how it has successfully predicted the last three presidential elections as well as both of this year's primaries. How is this possible, you may ask? MogIA uses data from sources such as Google, YouTube, and Twitter to analyze how well a candidate is doing, and it updates itself over time by learning from the online environment.
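
MogIA's actual model is proprietary, but here is a heavily simplified Java sketch of the flavor of an engagement-based predictor: count how often each candidate shows up in a batch of posts and treat the larger tally as the stronger signal. The posts and candidate names below are invented purely for illustration.

import java.util.HashMap;
import java.util.Map;

public class EngagementTally {
    public static void main(String[] args) {
        // Invented posts standing in for scraped social media data
        String[] posts = {
            "Watched the debate, CandidateA was strong",
            "CandidateB rally today!",
            "CandidateA CandidateA everywhere on my feed",
            "Not sure about CandidateB"
        };
        String[] candidates = {"CandidateA", "CandidateB"};

        // Count how often each candidate is mentioned across all posts
        Map<String, Integer> mentions = new HashMap<>();
        for (String post : posts) {
            for (String candidate : candidates) {
                int count = post.split(candidate, -1).length - 1;
                mentions.merge(candidate, count, Integer::sum);
            }
        }
        System.out.println("Mention counts: " + mentions);
    }
}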

While MogIA is definitely not perfect, it has yet to be wrong when predicting presidential elections. It is yet another example of an incredible algorithm that can analyze an event and predict its outcome without bias, and more accurately than humans can.

Sources:
1. http://www.businessinsider.com.au/artificial-intelligence-trump-win-2016-10
2. http://zdnet1.cbsistatic.com/hub/i/r/2016/11/09/25fd2587-0c5c-4afc-bcdf-96539d83d420/thumbnail/770x578/feb141cb6013a9120dad232d0b7d513b/trump-cyber-crisis.jpg
3. http://realiran.org/wp-content/uploads/2016/08/US-elections-2016-638871.jpg

Friday, November 4, 2016

Sorting Data

Have you ever wondered what the most efficient way to sort data is? Say you have 10 randomly generated numbers and want to sort them from highest to lowest. I think we could all figure out a way to do that in Java without it taking too long, but it becomes an issue when dealing with big data. As a result, there are over 15 different methods for sorting data. The video below of data being sorted is actually oddly satisfying.
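
As a concrete example, here is a minimal Java sketch of one of those methods, selection sort, applied to 10 random numbers sorted from highest to lowest (the class name is just for illustration):

import java.util.Arrays;
import java.util.Random;

public class SortDemo {
    public static void main(String[] args) {
        // Generate 10 random numbers between 0 and 99
        Random rand = new Random();
        int[] numbers = new int[10];
        for (int i = 0; i < numbers.length; i++) {
            numbers[i] = rand.nextInt(100);
        }

        // Selection sort, highest to lowest: repeatedly find the largest remaining
        // value and swap it to the front of the unsorted part of the array
        for (int i = 0; i < numbers.length - 1; i++) {
            int maxIndex = i;
            for (int j = i + 1; j < numbers.length; j++) {
                if (numbers[j] > numbers[maxIndex]) {
                    maxIndex = j;
                }
            }
            int temp = numbers[i];
            numbers[i] = numbers[maxIndex];
            numbers[maxIndex] = temp;
        }

        System.out.println(Arrays.toString(numbers));
    }
}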


Additionally, a new algorithm was created to solve the "cake-cutting" problem, in which you must determine how to cut a cake fairly among multiple people given their different preferences. Think about a holiday party with a fruit cake: every member of the family has a different preference for how big a slice they want and which fruits they want in it. The problem tackles the question of fairness, which was previously viewed as mathematically unsolvable. Recently, however, a researcher was able to solve it for groups of anywhere between 3 and 203 people.
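
The new algorithm itself is far too involved to reproduce here, but the classic two-person version of the problem, "cut and choose," gives a feel for what fairness means. Below is a toy Java sketch of that idea with made-up valuations; the published result extends this kind of guarantee to much larger groups.

public class CutAndChoose {
    public static void main(String[] args) {
        // Hypothetical values (out of 100) each person assigns to the two pieces:
        // the cutter splits the cake so both pieces look equal to them,
        // while the chooser may value the pieces differently
        double[] cutterValues  = {50.0, 50.0};
        double[] chooserValues = {30.0, 70.0};

        // The chooser takes whichever piece they value more
        int chooserPick = (chooserValues[0] >= chooserValues[1]) ? 0 : 1;
        int cutterPiece = 1 - chooserPick;

        System.out.println("Chooser takes piece " + chooserPick
                + ", worth " + chooserValues[chooserPick] + " to them.");
        System.out.println("Cutter keeps piece " + cutterPiece
                + ", worth " + cutterValues[cutterPiece] + " to them.");
        // Both walk away with a piece worth at least half by their own valuation,
        // which is the notion of fairness the new research extends to larger groups.
    }
}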



The most notable part of this solution is that it shows that previously "unsolvable" problems can actually be solved. Mathematicians should now be inspired to give those impossible problems another look. Also, if you haven't already, please watch the data sorting video above; it's actually really cool.


Resources:
1. http://www.digitaltrends.com/cool-tech/sorting-algorithms-video/
2. http://www.digitaltrends.com/computing/cake-cutting-algorithm/
3. http://d2gk7xgygi98cy.cloudfront.net/1296-3-large.jpg

Friday, October 28, 2016

"Hive Minds" Accurately Predict Results


Recently, the concept of using "Hive Minds" to predict the results of events has been gaining steam. A Hive Mind is the idea that looking at the decisions of individuals in a group can predict what will most likely happen. The idea behind Hive Minds, also known as Swarm Intelligence, stems from how bees, ants, and fish act in groups to make collective decisions. While the concept is not too far-fetched and seems like it would be very similar to a poll or debate, it actually gets substantially better results.



For example, the Hive Mind program UNU was recently able to predict the top 4 horses in order at the Kentucky Derby despite 540-1 odds. While its results may not always be perfect, they often are, or are pretty close. Below I have shown two more examples of Hive Mind decisions:




The UNU program is a new and incredibly interesting system that uses a unique algorithm to take user opinions as input and determine what will most likely occur as output. While it may be impossible to ever create a fully infallible program, this is an intriguing new way to make decisions.
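
As a very rough illustration of that input-to-output idea, here is a toy Java sketch in which each participant pulls toward an option with some strength and the option with the strongest combined pull wins. UNU's real algorithm works in real time and is far more sophisticated; the options and pull strengths below are made up.

import java.util.HashMap;
import java.util.Map;

public class SwarmVote {
    public static void main(String[] args) {
        // Made-up opinions: the option each participant pulls toward and how strongly (0.0 to 1.0)
        String[] options  = {"Horse A", "Horse B", "Horse A", "Horse C", "Horse A"};
        double[] strength = {0.9,        0.4,       0.6,       0.7,       0.5};

        // Add up the pull on each option
        Map<String, Double> totals = new HashMap<>();
        for (int i = 0; i < options.length; i++) {
            totals.merge(options[i], strength[i], Double::sum);
        }

        // The option with the strongest combined pull is the swarm's prediction
        String best = null;
        for (Map.Entry<String, Double> entry : totals.entrySet()) {
            if (best == null || entry.getValue() > totals.get(best)) {
                best = entry.getKey();
            }
        }
        System.out.println("Swarm prediction: " + best);
    }
}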

Resources:

  1. http://unanimous.ai/human-swarms/
  2. http://www.iflscience.com/technology/hive-mind-artificial-intelligence-wins-almost-11000-horse-bets/

Friday, October 14, 2016

Computer Science as a Way to Fight Cancer?




Microsoft has recently started working closely with biology researchers to find a way to combine computer science and biology to fight cancer. On the surface it may seem like a crazy idea: how can computer science fight cancer, a disease which occurs in the human body? Interestingly enough, it actually makes a lot of sense. Microsoft is essentially working on developing a "molecular computer" that would fit inside cells and allow them to monitor for diseases and prevent them from spreading.

Is this something that will happen tomorrow? No, most likely not. Microsoft is giving itself a 10-year timetable to develop this concept and do testing. However, this new look at how to preemptively prevent cancer has serious implications. If it could work for cancer, it could theoretically be applied to other diseases as well. If the idea works, which is definitely a big if, it could potentially eliminate many of the serious diseases that kill millions every year. It would essentially be a super vaccine and a turning point in human history.


The real question comes down to whether it is really possible to program a human cell the way a computer is programmed. Can the same process of scanning for and then eliminating problems that computers perform actually be applied to the human body? Only time will tell, but it is certainly something to keep an eye on.





Resources:
1) http://www.wired.co.uk/article/microsoft-solve-cancer-computer-science
2) http://futurism.com/wp-content/uploads/2016/09/Microsoft-will-solve-cancer-within-10-years-by-reprogramming-diseased-cells-600x315.jpg
3) http://wallstreetpit.com/wp-content/uploads/news/pbd/health-stem-cell.jpg
4) https://abm-website-assets.s3.amazonaws.com/rdmag.com/s3fs-public/featured_image/2016/06/rd1606_microsoft_@.jpg

Friday, October 7, 2016

Star Citizen- A Computer Game That is Out of This World

Today I will be talking about the long-awaited, and still yet to be fully released, computer game Star Citizen. In January of 2012, the small video game studio Roberts Space Industries posted on Kickstarter the idea for a computer game that would push every boundary previously set in the industry. Star Citizen would be a massively multiplayer online role-playing game in which you could fly around in a spaceship, get off at different planets, participate in in-game events, and, most importantly, explore an entire virtual universe. Due to these lofty goals and fan support, Star Citizen has raised over 100 million dollars, making it one of the most successful crowdfunding campaigns of all time.

It is the notion of an endless universe that makes Star Citizen stand out. The endless universe mimics the size of our own and will be so large that nobody will ever be able to explore it all; the game will essentially be limitless. Every planet will have its own unique size, shape, weather patterns, biomes, animals, items, and so on, so that the player will never get bored. The graphics, memory, and hardware required to make a game of this magnitude are massive and help explain why something like this has never been done before. Despite many hiccups along the way, Star Citizen is aiming to be released by the end of 2016.

In class we have experimented with random generators, talked about memory storage, and started designing programs with images that move. To me, it is absolutely amazing to think about the scale of this game and the skill required to produce it. However, it goes to show that, as we have seen throughout the 21st century, with computers anything is possible.
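
As a nod to those class exercises, here is a tiny Java sketch of how a random generator can hand every planet its own attributes. A real game engine generates terrain, weather, and ecosystems procedurally at a vastly larger scale; the ranges and biome names below are made up.

import java.util.Random;

public class PlanetGenerator {
    public static void main(String[] args) {
        String[] biomes = {"desert", "ocean", "forest", "ice", "volcanic"};
        Random rand = new Random();

        // Give each planet a randomly chosen size, biome, and gravity
        for (int i = 1; i <= 3; i++) {
            int radiusKm = 2000 + rand.nextInt(8000);        // size in kilometers
            String biome = biomes[rand.nextInt(biomes.length)];
            double gravity = 0.5 + rand.nextDouble() * 1.5;  // relative to Earth
            System.out.printf("Planet %d: radius %d km, %s biome, gravity %.2fg%n",
                    i, radiusKm, biome, gravity);
        }
    }
}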

Resources:
  1. http://starcitizen.wikia.com/wiki/Crowdfunding_campaign
  2. http://blogs-images.forbes.com/jasonevangelho/files/2015/10/star-citizen1.jpg
  3. http://blogs-images.forbes.com/erikkain/files/2016/08/star-citizen-header-1200x675.jpg
  4. https://robertsspaceindustries.com/about-the-game/spaceflight

Thursday, September 29, 2016

Google's New AI Kill Switch

Have you ever wondered what would happen if we reached a point where robots carried out all of the mundane tasks that we don't want to do? Then what if these robots learned to optimize those tasks and update themselves based on real-world experiences? Then what if they stopped updating to do the right thing and instead began doing the wrong thing? What if this led to robots plotting against humans in an attempt to take over the world? Yes, what I just described is the plot of "I, Robot," but the general line of thinking is more realistic than you might expect.

AI Robots in "I, Robot"

Google researchers have recently developed an AI kill switch that can shut down an AI robot no matter what. This is incredibly important because AI robots often work by trying to maximize some sort of function or algorithm. They read real-world events as inputs and adapt those inputs into their systems so that they can continue to maximize their functions. Human interference can be considered one of these real-world events. Therefore, if the machine determines that human interaction is detrimental to maximizing its function, or that the human is trying to shut it down, it could act out against the human to stop this from happening. Due to this fear, Google created an AI kill switch to automatically shut down a robot at any time.
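
Here is a bare-bones Java sketch of the kill switch idea: the agent keeps trying to maximize its score, but every cycle it checks an external flag that a human can set. The actual research is about designing the learning itself so the agent never resists being interrupted; the flag below is just a toy stand-in for that.

public class InterruptibleAgent {
    private volatile boolean killSwitch = false; // set by a human operator
    private double score = 0.0;

    public void run() {
        // The agent keeps pursuing its objective until the kill switch is pressed
        while (!killSwitch) {
            score += takeAction();
        }
        System.out.println("Agent halted by kill switch with score " + score);
    }

    private double takeAction() {
        return 1.0; // placeholder for whatever task the robot is optimizing
    }

    public void pressKillSwitch() {
        killSwitch = true; // human interruption the agent cannot override
    }

    public static void main(String[] args) throws InterruptedException {
        InterruptibleAgent agent = new InterruptibleAgent();
        Thread worker = new Thread(agent::run);
        worker.start();
        Thread.sleep(100);        // let the agent work briefly
        agent.pressKillSwitch();  // the human steps in
        worker.join();
    }
}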



However, Google would never want to use the kill switch unless it had to, and is instead working on a way to have robots read human interactions as part of the task rather than as a detrimental interruption. Doing so would essentially allow humans to teach robots things such as not to go outside in the rain. Hopefully, through both teaching robots how to read human interactions and including a kill switch in the machines, we will never have to live through the reality of "I, Robot."



Resources:
  1. http://motherboard.vice.com/read/google-researchers-have-come-up-with-an-ai-kill-switch
  2. http://www.themanufacturer.com/wp-content/uploads/2015/08/EPA245.jpg
  3. https://www.singularityweblog.com/wp-content/uploads/2015/02/Kill-Switch.jpg
  4. http://images.contentful.com/7h71s48744nc/3l9IwugJKgcCq8YMwWGiy/8fdbc22b56ca7a2d34fb26e76952533e/i-robot.jpg



Thursday, September 22, 2016

Backing up Memory to an External Hard Drive

Earlier this week I knocked a cup of water off my desk and onto my laptop, causing my heart to skip a beat and my life to flash before my eyes. As I looked down, I watched prized pictures, videos, documents, and everything else on my computer go down the drain. Luckily, the water only got onto the top of my computer and everything was fine. However, I realized that I had to back up everything stored in my computer's memory immediately. Of course, as with all things I need, I went onto Amazon and bought an external hard drive right away.

This hard drive is similar to the one I purchased

After the hard drive came in the mail yesterday, I hooked it up to my computer, followed the prompts, and then waited as 320.6GB of my precious information was transferred over. From a computer science standpoint, the process worked like this: I first cleared the memory on the external hard drive so that it was empty. I then connected it to my computer with a USB cable to establish a link between the two devices. The computer then sent signals carrying the information to the hard drive, which wrote that data into its memory. After the data was all copied over, I had a full backup of all of the information on my computer. Talk about a relief!
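
For a sense of what "writing data into the memory on the hard drive" looks like in code, here is a tiny Java sketch that copies a single file onto an external drive. The file and drive paths are made up for the example, and a real backup tool would walk every folder and copy each file it finds.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class SimpleBackup {
    public static void main(String[] args) throws IOException {
        Path source = Paths.get("C:/Users/me/Documents/photo.jpg"); // hypothetical file on the computer
        Path backup = Paths.get("E:/backup/photo.jpg");             // hypothetical external drive

        // Make sure the backup folder exists, then write a copy of the file onto the drive
        Files.createDirectories(backup.getParent());
        Files.copy(source, backup, StandardCopyOption.REPLACE_EXISTING);
        System.out.println("Copied " + Files.size(backup) + " bytes to the external drive.");
    }
}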

The new SanDisk 1TB microSD card
This experience relates to our class discussion on how memory works. The hard drive is able to extract the information from the bytes in the computer and copy it into its own memory. While the original model for this kind of system, the Turing machine, was based around an infinite amount of space, the hard drive that I purchased has 1 terabyte of space. While this seems like a lot, it is actually not that much when you look at recent developments. Just a few days ago, SanDisk revealed a design for a 1 terabyte memory card that alone has more storage than my entire computer. Hear that, Apple? Please adopt these 1 terabyte memory cards so we no longer have to run out of storage on our phones! As progress with memory storage continues, there may come a day when we have a near infinite amount of storage on our devices.




Friday, September 9, 2016

Drones-- Computer Science in our Everyday Lives

One of the most prominent emerging breakthroughs in 21st-century robotics has been drone technology. Drones, a subset of UAVs (unmanned aerial vehicles), use infrared cameras, GPS technology, lasers, and state-of-the-art computer software to be piloted from remote locations. Their use has spread into many aspects of our lives, ranging from recreation to cinematography to the military.

A military drone from the early 2000's
Drones work via remote control from an operator on the ground. The person controlling the drone uses joysticks and buttons to control the flight pattern, camera, and specific functions of the drone. Each of these operations can be considered an input which the drone's program must decipher to determine an output. For example, if the operator pulls the joystick down and to the left, the drone must receive that signal and read it to know that it must fly up and to the left.
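
Here is a simplified Java sketch of that input-to-output step: the drone's software reads the joystick position and translates it into a flight command. A real flight controller turns this into individual motor speeds through control loops; the mapping below is purely illustrative.

public class DroneController {
    // Axes range from -1.0 (stick pulled fully down/left) to 1.0 (pushed fully up/right).
    // Following the convention above, pulling the stick down commands the drone to climb.
    public static String interpret(double vertical, double horizontal) {
        String pitch = vertical <= 0 ? "climb" : "descend";
        String roll  = horizontal <= 0 ? "bank left" : "bank right";
        return pitch + " and " + roll;
    }

    public static void main(String[] args) {
        // The operator moves the joystick down and to the left
        System.out.println(interpret(-0.5, -0.8)); // prints: climb and bank left
    }
}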

How military drones function
In recent years drones have come under fire for ethical reasons. People argue that drones can be used to violate personal privacy and can also be considered a dangerous advancement in warfare. However, despite these concerns, drones have continued to be refined and are becoming more and more prominent in the commercial marketplace. While only time can tell the future of drones, it is safe to say that they are a prominent example of the influence of computer science on our lives.

Resources:
Pictures
  • http://ichef.bbci.co.uk/news/624/media/images/48461000/gif/_48461757_how_drones_work_464.gif
  • http://www.trbimg.com/img-5244b88a/turbine/la-na-nn-fbi-using-drones-2006-20130926
Resources
  • https://www.dronezon.com/learn-about-drones-quadcopters/what-is-drone-technology-or-how-does-drone-technology-work/
  • http://www.telegraph.co.uk/technology/2016/04/18/drones-are-not-toys--theyre-dangerous-and-they-must-be-regulated/



Bitcoins- a complex system broken down



To begin this post I want to clarify one thing: Bitcoins are confusing. Bitcoins are an online currency used for untraceable, non-refundable online transactions. They operate without any central authority and are protected through cryptography. Bitcoin's value stems from consumer confidence, meaning people are willing to accept bitcoins, or buy them, based on what they believe they can later sell them for in return. As of September 8th, the digital currency had reached a value of $630.
The value of Bitcoin over the past 5 years.
As a digital currency that facilitates online transactions and is protected by cryptography, Bitcoin is inherently related to computer science. However, it relates most to our in-class discussions through the way bitcoins are "mined". Bitcoin regulates inflation through a process called "mining", in which users with powerful enough computers run special software to solve complex math problems and receive bitcoins in return. Weird, right? These math problems support Bitcoin's operating processes and cryptography and therefore help ensure the security of the currency. Since the problems require powerful computers and would be too much for Bitcoin to manage on its own, the responsibility is passed off to its users.
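
Here is a toy Java version of the kind of "complex math problem" miners solve: keep trying different numbers (nonces) until the SHA-256 hash of some block data starts with a few zeros. Real mining works on real block headers against a vastly harder target; the transaction text and target below are made up to keep the example quick.

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ToyMiner {
    public static void main(String[] args) throws NoSuchAlgorithmException {
        MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
        String blockData = "Alice pays Bob 1 BTC"; // made-up transaction data
        String target = "0000";                    // far easier than the real network target

        // Keep guessing nonces until the hash of the data starts with the target prefix
        long nonce = 0;
        while (true) {
            byte[] hash = sha256.digest((blockData + nonce).getBytes(StandardCharsets.UTF_8));
            String hex = toHex(hash);
            if (hex.startsWith(target)) {
                System.out.println("Found nonce " + nonce + " -> " + hex);
                break;
            }
            nonce++;
        }
    }

    private static String toHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }
}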

Two computers set up for Bitcoin mining

These programs and the problems that they solve are what relate to our class. The software takes an input (the math problem), runs an algorithm, and then provides an output (the solution). That output is then rewarded with bitcoins, which can then be used for online transactions. This is a clear example of an industry that relies on computer science fundamentals in order to survive and prosper.

References:
Pictures
  • https://46qasb3uw5yn639ko4bz2ptr8u-wpengine.netdna-ssl.com/files/2014/01/mining-rig.jpeg
  • http://mfi-miami.com/wp-content/uploads/2015/10/bitcoin.png
  • http://www.coindesk.com/bitcoin-price-flirts-630-traders-bet-long/
Content
  • http://www.coindesk.com/bitcoin-price-flirts-630-traders-bet-long/
  • https://en.bitcoin.it/wiki/Main_Page
  • https://www.bitcoinmining.com/





Thursday, September 1, 2016

"The Little Prince"- CG Animation vs. Paper Cutout Animation

CG on the left and Paper Cut-Out on the Right


On August 5, 2016, Netflix released its original film "The Little Prince". Adapted from Antoine de Saint-Exupéry's popular 1943 novella of the same name, the film received rave reviews, with a 93% on Rotten Tomatoes and a 7.8/10 on IMDb. The reason it received such critical acclaim was its seamless integration of CG Animation and Paper Cutout Animation into the same film.


CG Animation is the style of animation that has made large studios such as Pixar and DreamWorks so famous. This style requires painstaking effort to animate just one frame, of which there are about 24 per second in the film. The level of detail that goes into the backgrounds, characters, dialogue, etc. accounts for why films such as Disney's "Frozen" take so long to make. For example, an article on Bennet.com stated that for one scene alone, due to "its complexity, the scene took 4,000 computers over 30 hours to render each and every frame."
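
Some quick arithmetic shows why the frame counts alone are daunting: a 90-minute film at roughly 24 frames per second needs on the order of 130,000 rendered frames. Here is that calculation as a tiny Java snippet (the run time and frame rate are just round numbers, not figures for any particular film):

public class FrameCount {
    public static void main(String[] args) {
        int minutes = 90;            // a typical feature-film run time
        int framesPerSecond = 24;
        int totalFrames = minutes * 60 * framesPerSecond;
        System.out.println("Frames to render: " + totalFrames); // 129,600
    }
}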


CG Animation

Paper Cut-Out Animation


Paper Cutout Animation, by contrast, is actually filmed using paper figures and stop-motion animation. These scenes require physically creating a set rather than digitally designing one. Paper Cutout Animation is incredibly difficult in its own right, since if one part of a scene needs to be changed, the entire scene must be redone to account for it.

"The Little Prince" uses this disparity to its advantage by creating two different "worlds" within the film-- one in CG and one with paper cut-outs. However, if it weren't for computers, we would not be able to do CG Animation, and the film would be one-dimensional. As a result, "The Little Prince" serves as a perfect example of how computers and CG Animation open up a whole new world of opportunities in the film industry.

Tuesday, August 30, 2016

Virtual Reality for Dummies

Hi everyone! Today I decided to focus on what I view as one of the coolest new technological trends with endless possibilities: virtual and augmented reality.

Virtual reality (VR) is when computers are used to create an entirely artificial world to be immersed in. Devices such as the Oculus Rift do this by placing a dynamic screen in front of your eyes that dominates your visual perception and updates based on how you move your head.


Augmented reality (AR) is when computers are used to enhance the real world surrounding the user. An example of this is Pokémon Go, in which Pokémon are overlaid on the real world.




The most exciting thing about VR and AR, though, is that their possibilities are endless. One example is the newest proposed application: VR/AR computers. Essentially, these computers would have virtual keyboards, virtual mice, and voice recognition capabilities. Does this sound too futuristic and unrealistic? Well, in some ways, yes it is. This is still just a concept, a full prototype will take years to develop, and it will face pushback from computing traditionalists. However, these are exactly the kinds of things that are now possible following the recent strides in VR and AR technologies.

I will be making my next post soon, and remember: if I can do this, then you can too!

References:
Picture 1) http://time.com/4471620/virtual-reality-computers/
Picture 2) https://cnet4.cbsistatic.com/img/iNSgKL0u3mNnlMC9eyHbbgGUKMA=/fit-in/270x0/2016/07/08/a82975f5-6adb-4dec-8bec-561ca3d348ea/pokemon-go-gif.gif
Article 1) http://time.com/4471620/virtual-reality-computers/

Introduction to "Computer Science for Dummies"



Welcome everyone to my new blog, "Computer Science for Dummies"! I have recently started taking an introductory computer science class at the University of Richmond taught by Professor Jory Denny. As part of a requirement for the class, I will be posting twice a week about interesting, new, and fun topics that I have discovered in the computer science field.

Why "Computer Science for Dummies"? Because I am a computer science dummy! My goal is that if I can understand these topics, then anyone can. I will aim to break down computer science trends into understandable and relatable posts that anybody can read.


I will be making my first post soon, and remember: if I can do this, then you can too!

References:
Picture 1) https://i.ytimg.com/vi/Fl8L3vQf5vE/maxresdefault.jpg