Artificial Intelligence has grown to be one of the most popular areas in Computer Science recently. Researchers have spent days and years inventing and refining algorithms to make machines learn all kinds of things. I have to say that researchers have made a lot of progress, since AIs are blending into people’s daily lives. Just two weeks ago, many people were following the great Go battle between a human and a machine. The machine involved was called AlphaGo. AlphaGo is a computer program developed by Google DeepMind in London to play the board game Go. It beat Lee Sedol in a five-game match. It was the first time a computer Go program had beaten a 9-dan professional without handicaps. People were impressed and shocked by how smart a computer program could be. It is scary to see machines becoming more intelligent than humans in some fields.
Although some machines have a relatively high IQ, do they have a high EQ as well? Can machines adopt some kind of algorithm to read people’s emotions? Can machines really understand what people are talking about semantically? Over the last two weeks, I have been using the topic modeling tool Mallet and the sentiment analysis tool Alchemy API to analyze my corpus. I was looking forward to finding out how well a machine could understand my corpus.
I first used Mallet to generate a list of topics contained in the text. Here is the result from output_html:
For some reason, I was unable to navigate to the specific chunks of text that generated these topics. Then I tried command-line Mallet to see if I would have better luck. Here are the results I got:
Although the topics could be shown, I was still unable to navigate to the text. Looking at the topics generated by Mallet, I was not very satisfied with its understanding of my corpus. I would not consider these words “topics,” since they are not general enough. In fact, I think these words are simply the key ingredients found in all the cookbooks in my corpus.
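There is a simple reason ingredient words end up looking like topics: a topic model surfaces words that frequently co-occur across documents, and in a cookbook the most frequent content words are ingredients and cooking actions. A minimal sketch in Python makes the point, using made-up recipe snippets (not my actual corpus):

```python
from collections import Counter

# Toy stand-ins for recipe texts -- not my actual corpus.
recipes = [
    "boil water add salt add noodles drain water",
    "mix flour sugar butter bake until golden",
    "heat oil add garlic add onions stir add salt",
]

# A tiny hand-picked stopword list, just for illustration.
stopwords = {"add", "until", "the", "and"}

counts = Counter(
    word
    for recipe in recipes
    for word in recipe.split()
    if word not in stopwords
)

# The top words are ingredients and basic actions --
# the same kind of "topics" Mallet surfaced for me.
print(counts.most_common(5))
```

Words like “water” and “salt” float to the top here for the same reason Mallet’s topic keys were full of ingredients: frequency and co-occurrence, not meaning.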
Moving on to using machines for sentiment analysis, I selected one recipe from the Chinese-Japanese Cookbook. I uploaded the selected recipe to Alchemy. Here is the result I obtained:
It is interesting to see that some ingredients in this recipe are considered to have positive or negative sentiment. For example, “cold water” indicates a negative sentiment. I think that is a reasonable reading in isolation; cold water could indeed bring pain to people. However, depending on the context, cold water does not necessarily express a negative sentiment in a recipe.
I also had Alchemy read this recipe’s emotions:
I was surprised to see that a recipe could express all these different emotions. I figured Alchemy does its analysis based on certain keywords’ apparent meanings instead of their meanings in context.
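That guess about keyword lookup is easy to illustrate. Below is a deliberately naive sentiment scorer of my own invention (a toy, not how Alchemy actually works) that tags “cold” as negative no matter what, so “cold water” in a recipe scores exactly the same as “cold” in a genuinely unhappy sentence:

```python
# A deliberately naive keyword-based sentiment scorer.
# This is my own toy illustration, NOT Alchemy's actual method.
KEYWORD_SENTIMENT = {
    "cold": -1, "pain": -1, "bitter": -1,
    "fresh": 1, "delicious": 1, "golden": 1,
}

def keyword_sentiment(text):
    """Sum per-keyword scores, ignoring context entirely."""
    return sum(KEYWORD_SENTIMENT.get(w, 0) for w in text.lower().split())

# Both sentences get the same negative score, even though
# "cold water" in a recipe is perfectly neutral in context.
print(keyword_sentiment("rinse the noodles in cold water"))
print(keyword_sentiment("she was left alone in the cold"))
```

Any tool that works this way will inevitably misread instructions like a recipe, where the “sentiment words” are just descriptions of ingredients and temperatures.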
Nevertheless, I should not put all the blame on the machine for not being able to understand me. Even people disagree with each other sometimes. In the example we did in class, people had different opinions on which words should be considered related to “war” or “nation.” It is acceptable for machines to misunderstand us.
In his article “Patacomputing,” Ramsay said: “What is different about digital archives is the way in which text analytical procedures (including that most primitive of procedures: the keyword search) has the potential to draw unexpected paths through a documentary space that is distinguished by its overall incomprehensibility.” I think he is telling us not to rely fully on machines to do textual analysis, but rather to treat machines as useful tools that assist us. Machines can sometimes discover connections within a text that cannot be found by humans.
Ramsay also claimed that “It is manifestly impossible to read everything, and it has always been so.” Yet digital tools “are capable of presenting the bare, trivial truths of textuality in a way that allows connection with other narratives-in particular, those narratives that seek to install the text into a network of critical activity.” Therefore, tools like Alchemy and Mallet have great potential that awaits my discovery.
Katie Faull says:
Excellent observations! Maybe cookbooks are not supposed to have emotions? As they are a set of instructions, they should be objective.