Ariana's Portfolio

  • About Me
  • Reading Log
  • ELA Blog
  • Symposium 2018
  • Writing Portfolio

Symposium 2018


Conflict is composed of opposing forces

The two opposing sides in this conflict are the biased algorithms themselves and the viewers or users of those algorithms. These two sides oppose each other because biased algorithms can eventually harm the democratic values outlined in the freedom of the press. Based on the perspective that people choose to follow, they can then decide what they want to believe. Algorithms are commonly described as the recipes or rules that respond to what people search. According to the article, Experts on the Pros and Cons of the Algorithm Age, “Microsoft engineers created a Twitter bot named “Tay” this past spring in an attempt to chat with Millennials by responding to their prompts, but within hours it was spouting racist, sexist, Holocaust-denying tweets based on algorithms that had it “learning” how to respond to others based on what was tweeted at it.” This shows how algorithms react to biased information once it is fed to them. Facebook did something similar: its Trending Topics feature turned out to be biased in order to please readers and their opinions. These systems know and track your political views, the rules grow over time, and once the algorithm figures out your interests it keeps sending the same type of information toward you.


Conflict may be natural or man-made.


The conflict of biased algorithms is both natural and man-made. It is natural in the sense that algorithms originated from the idea of helping users understand and access information easily. Algorithms are not biased until humans feed opinion-based information into them. The MIT Media Lab experts state, “This is even more true in machine learning — machines don’t bring prior experience, contextual beliefs, and all the other things that make it important to meet human learners where they are and provide many paths into content.” These machines are not biased until we teach them what we know and what our views are. Humans are essentially the ones who create the conflict, because algorithms are rules that do not respond until a search is inputted; the bias then originates from how the searches shape the responses. The rules follow a simple pattern: if a liberal idea is searched, a liberal idea will be processed, and the same goes for a conservative idea. This is expressed in the student-led research, because the information fed to both computers showed that if the algorithms are given conservative or liberal information, they will continue to cater that information toward the user.
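The “liberal search in, liberal result out” rule described above can be sketched as a toy program. Everything here (the article list, the leaning labels, the recommend function) is a hypothetical illustration, not the code of any real platform:

```python
from collections import Counter

def recommend(history, articles):
    """Toy recommender: return articles from whichever leaning the
    user has interacted with most often. If there is no history yet,
    everything is shown. (Hypothetical illustration only.)"""
    if not history:
        return articles
    top_leaning = Counter(history).most_common(1)[0][0]
    return [a for a in articles if a["leaning"] == top_leaning]

articles = [
    {"title": "Tax cuts work", "leaning": "conservative"},
    {"title": "Expand healthcare", "leaning": "liberal"},
]

# A user who has only engaged with liberal stories now sees only liberal stories.
feed = recommend(["liberal", "liberal"], articles)
```

Notice that the function itself has no opinion; the one-sidedness comes entirely from the history it is fed, which is the point the paragraph above makes.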


Conflict may be intentional or unintentional

The conflict of biased algorithms originated with the intention of being helpful to users, not of being biased. The conflict is unintentional because the purpose of algorithms in technology is to help people access information easily. According to the article, Why We Should Expect Algorithms to Be Biased, by Nanette Byrnes, “But the danger remains that unrecognized bias, not just in the programming of an algorithm but even in the data flowing into it, could inadvertently turn any program into a discriminator.” The article is essentially suggesting that what is behind the search, and what kind of views the searcher holds, is what the algorithm will process; it ends up coming back to the viewer. The MIT Media Lab states, “The textbook in machine learning is the ‘training data’ that you show to your software to teach it how to make decisions.” If we do as the experts say, we can change the way these patterns are sent out. Companies like Facebook and Google have created a search bubble, meaning that their engines filter results to favor either the liberal or the conservative side. According to staff from Science magazine, “For example, liberals and conservatives may rarely learn about issues that concern the other side simply because those issues never makes it into their news feeds. Over time, this could cause political polarization, because people are not exposed to topics and ideas from the opposite camp.” This contributes to the way users get caught up in their own thought bubble and are unable to expand their perspectives.




Conflict may allow for synthesis and change

A change that can happen in this conflict is that having algorithms build up a person's feed or searches can create a constantly evolving trend line. This change results from how users respond to the swayed algorithms that search engines use. If the conflict progresses and reaches other companies, it is a negative change because it can limit perspectives in a political sense. The trend or pattern that will be displayed is of the algorithms becoming more powerful and being relied on to make the right move. According to an observation made by Paul Cleverley, “Some content will be promoted and other content marginalized. Search Engine Optimization (SEO) is an iterative process to maintain/improve the search result quality that may see some content rise and others fall as a result of changes.” The content that does not fit the targeted person is put at the end, which explains how the searches in the various observations differ based on the different viewpoints that people can have. These changes are significant because they can shape the future and sway users to make decisions based on their political views.
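Cleverley’s point that ranking is an iterative process in which some content rises and other content falls can be illustrated with a toy loop. The scoring rule and the numbers below are invented purely for illustration:

```python
def update_ranking(scores, clicks, rate=0.1):
    """One iteration of a toy ranking update: clicked items gain score,
    ignored items lose a little, so content rises or falls over time.
    The update rule and rate are invented for illustration."""
    return {
        item: score + rate if item in clicks else score - rate / 2
        for item, score in scores.items()
    }

scores = {"story_a": 1.0, "story_b": 1.0}
for _ in range(5):                       # five rounds of user feedback
    scores = update_ranking(scores, clicks={"story_a"})

ranked = sorted(scores, key=scores.get, reverse=True)
# story_a has been promoted; story_b has been marginalized.
```

After five rounds, the story that kept getting clicked outranks the one that was ignored, even though both started with the same score.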




Conflict is progressive

The conflict of biased algorithms is progressive because technology is always evolving. As the number of artificial-intelligence systems increases, the bias will develop further. According to the ideas of algorithm expert Ed Finn in Perspective: Algorithm of the Enlightenment, “As humans, we consistently depend on metaphor to interpret computational results, and we need to understand the stakes and boundary conditions of the metaphors we rely on to interpret models and algorithmic systems.” As time goes on, the amount of artificial intelligence will increase, and the companies will do what is in their power to keep their users. By having these computer patterns work off of gathered data, the technology in the hands of large corporations will be trained to respond better and better to what has been inputted into the pattern. These systems essentially contribute to the user’s political views: the search engines create a feed that builds up a liberal or conservative mindset. Since an engine’s main focus is to gather information that will please you, it can pose a threat to how democracy works. Professor James Arvanitakis states, “More so, this is adaptive: if they drive something at you and you ignore it, something else is delivered to you until you take the (click) bait. It then knows your triggers and continues to deliver more of the same.” This shows that the trend within algorithms will keep working in favor of search engines, which are becoming smart enough to know what will catch one’s attention.
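Arvanitakis’s description (deliver something else until the user takes the bait, then deliver more of the same) can be sketched as a short loop. The topics and the user_clicks function are hypothetical stand-ins for real engagement signals:

```python
def adaptive_feed(topics, user_clicks):
    """Toy version of the adaptive behavior described above: cycle
    through topics until one gets a click, then keep serving the
    topic that worked. `user_clicks` is a hypothetical stand-in
    for real engagement signals."""
    delivered = []
    trigger = None
    for topic in topics:
        delivered.append(topic)
        if user_clicks(topic):        # the user "takes the (click) bait"
            trigger = topic
            break
    if trigger:
        delivered.extend([trigger] * 3)   # more of the same
    return delivered

feed = adaptive_feed(
    ["sports", "politics", "celebrity"],
    user_clicks=lambda topic: topic == "politics",
)
# feed == ["sports", "politics", "politics", "politics", "politics"]
```

Once the loop learns the user's trigger topic, everything after that point is a repeat of it, which is exactly the "knows your triggers" behavior the quote describes.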


Language of the Discipline

According to MIT professionals, algorithms are patterns or rules that are coded into computers. Algorithmic bias is shaping people’s perspectives and changing the way they view information by skewing it toward one point of view.

Undermine, as defined by Merriam-Webster: to weaken or ruin by degrees. This word is significant because it expresses how biased algorithms can weaken the values granted to Americans in the freedom of the press.

Conservative beliefs, as defined by Merriam-Webster: holding to traditional attitudes and values and cautious about change or innovation, typically in relation to politics or religion.

Liberal beliefs, as defined by Merriam-Webster: open to new behavior or opinions and willing to discard traditional values.








Impact within Perspectives

Biased algorithms can have an impact on people’s perspectives and the way they see the world. Having these patterns build up their world and their beliefs can essentially lead to a weakening of democracy. According to Antonis Mavropoulos, Facebook has been linked to fake news in the past, a result of all the power that people and computers are granted through the internet. Engadget expresses the impact these algorithms have on society by writing, “This is the basic problem with AI: Its algorithms are not neutral, and the reason they're biased is that society is biased. "Bias" is simply cultural meaning, and a machine cannot divorce unacceptable social meaning (men with science; women with arts) from acceptable ones (flowers are pleasant; weapons are unpleasant). A prejudiced AI is an AI replicating the world accurately.” As people continue to look only at conservative or liberal ideas, that not only manipulates the algorithms programmed by the search engines but also becomes part of one’s reality.


Student Led Research

I decided to do an observation on platforms like Facebook and Google to find out whether a relationship exists between one’s political views and the information catered toward them. To find a relationship between the algorithms and the viewer’s opinions, I tracked the paths of two people with completely different beliefs and compared how the algorithms of different computers, with different IP addresses, respond to the same topic.
Day 1: I made two different social media accounts, basing one off of conservative ideas and the other off of liberal ideas. Since these ideals are very different, the difference in results between the computers was quite noticeable.
Day 2: As the Facebook feed continued to grow and favor the posts I was liking on the conservative computer and those I was liking on the liberal computer, I found that each computer was providing information agreeing with the corresponding conservative or liberal ideology.
Day 3: As I kept joining organizations like the NRA (National Rifle Association of America) and anti-gay pages on the conservative computer, more related topics and news articles were shown. On the liberal computer, I was searching gay rights and following anti-Trump pages, so that feed was very different.
As the days went on, I continued to receive feminism-related information and posts that are often linked to liberal ideas, and the same happened with the information I gathered on the conservative computer. I found that one’s political views do have an impact on, and power over, the information presented to the viewer.
Having only one profile created a completely one-sided outlook on trending news, current events, and the world. Algorithms are problematic because most people will not open different accounts to get a full view of issues. The average user sets up one account and is only able to access one side of the world. The algorithms were set to make choices and shape beliefs to favor the user. Instead of a space to find information, pre-made beliefs are given to viewers. When you have choices, you can see them and choose your viewpoint; when you don't have choices, the power of the algorithm becomes instructive and feeds a mindset of bias to the viewer.
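A rough way to picture the three-day observation above is a toy simulation in which each day of liking same-leaning posts nudges the feed further to one side. The daily nudge of 0.15 is an invented number, chosen only to make the drift visible; it is not measured from Facebook:

```python
def simulate_account(leaning, days=3, nudge=0.15):
    """Toy model of the two-computer observation: each day of liking
    same-leaning posts raises the share of the feed matching that
    leaning. The 0.15 daily nudge is an invented number, chosen only
    to make the drift visible."""
    share = 0.5                       # Day 0: the feed starts balanced
    for _ in range(days):
        share = min(1.0, share + nudge)
    return {"leaning": leaning, "matching_share": round(share, 2)}

conservative = simulate_account("conservative")
liberal = simulate_account("liberal")
# After three days each feed is 95% one-sided, toward opposite camps.
```

Both simulated accounts end up equally one-sided, just in opposite directions, which mirrors how the two real accounts diverged over the week.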


Parallels

Biased algorithms are a relevant conflict in the 21st century because this generation gathers knowledge through technology. According to staff from Science magazine, “For example, liberals and conservatives may rarely learn about issues that concern the other side simply because those issues never makes it into their news feeds. Over time, this could cause political polarization, because people are not exposed to topics and ideas from the opposite camp.” Experts were able to show that some news sources deliver more liberal stories and others deliver more conservative ones. Science magazine found, “For example, the political alignment score of the average story in The New York Times came in at –0.62 (somewhat liberal) whereas the average Fox News story was +0.78 (somewhat conservative).” The news sources we prefer are the ones that are tracked, and they are how the algorithms build the patterns that provide information. If we keep looking at only one side of an argument, the future will be biased and divided according to what one searches on the computer. This is relevant to our world because it can cause a division of political standings, also known as political polarization. Although some political polarization is welcome in a true democracy, too much of it can create dysfunction. Biased algorithms are a conflict for our community because they can limit users’ perspectives and sway them to keep believing one side of a story or issue. They can prove harmful to the community because they contradict what the Constitution set up for us. By having these patterns filter information, they essentially lead users to believe in one side and only one side, which benefits the search engine companies. Biased algorithms are relevant to me because, as technology advances, it will have more control over what we choose to believe. It is important to make sure algorithms do not take over our opinions and limit us.
It is increasingly important for individual users to be aware of the biases that lurk within algorithms and to seek out multiple sources. This is essential if they want to understand the whole picture.



Reflection #1

In the topic of biased algorithms, I found it fairly difficult to find an angle that is very specific, so I took the angle of how they undermine values granted to us in the Constitution of the United States. According to the MIT Technology Review article “Biased Algorithms Are Everywhere, and No One Seems to Care,” experts state that news sources delivered through technology are likely to be biased. This fits the generalization “Conflict is composed of opposing forces,” because the algorithms themselves oppose the freedom of the press and impact democracy and one’s perspective. This means we are presented with biased information that guides our mentality and keeps it thinking one way. In “Algorithms or Democracy - You Choose,” Barry Devlin talks about the dangers that bias within algorithms poses to democracy and one’s political views. This fits the generalization “Conflict may be intentional or unintentional,” because the algorithms were clearly meant to help people access information online easily, but the recommendation section changes their purpose. In “Why We Should Expect Algorithms to Be Biased,” Nanette Byrnes expresses her findings on how liberal bias may exist on social media platforms like Facebook, whose CEO reported that the company would change its algorithm patterns. The writer of “Perspective: Algorithm of the Enlightenment,” Ed Finn, compares the ideology of algorithms and traces when they originated. One philosopher in particular, Leibniz, had an idea to create codes (algorithms) and use them as a way of living. In the article “Living in the Age of Algorithms,” the power of these codes is dissected to let us (the viewers) know how much influence and information one can get at the touch of a button.
In this case the conflict was man-made, because the problem is the algorithms, and those were man-made. In “Code-Dependent: Pros and Cons of the Algorithm Age,” Lee Rainie and Janna Anderson talk about the ways algorithms work; a con is that they limit the viewer’s perspective, while a pro is that they make it easier to access information. This article supports the generalization that conflict may be intentional or unintentional, because it shows how algorithms can help and why they were created (the pro), while the algorithms failing was unintentional. I hope to find more information on specifically how algorithms have a big impact on democracy and what can happen to democracy because of them. I hope to conduct a survey/experiment to help me find the association between age and how algorithms work.


Reflection #3

I have added new information to my ISD since the last reflection. I found out that, because my topic is so recent, finding information across time was going to be more difficult. This limited my options for being able to use the icon over time; to still use the icon, I found that I could interpret it in the way I want to deliver the information. For my student-led research I will be conducting an observation. Through it, I hope to find out how the algorithms respond to a liberal or conservative idea that is fed to them. On one computer I will focus on liberal ideas and on the other I will focus on conservative ideas; over the course of a week, I will be able to find out how the different algorithms respond to the exact same topic. The difficult part has been finding a clear idea and process to fully describe my project. Since I am working by myself on this project, it has been easier for me to keep control of it and make sure not to procrastinate. I will need to stay on top of things so I won’t miss due dates and, most importantly, so I won’t lose control of my project. One thing I should change is actually looking at the sentence starters and resources that are provided, because they really do help.



Reflection #4

I had some difficulties making sure that my research was aligned to my driving question, so I had to make sure to find the correct information. As I kept looking, I was not satisfied with the research I had found. Because of this problem, I knew I had to change something, so I decided it would be better to change my driving question. I learned that some of the places where we get our information are more liberal, while others are more conservative. I learned that I am actually capable of working on big projects on my own, and I learned a big lesson about time management and dealing with deadlines; I know now that I should have assignments done beforehand. To really get across the message of how algorithms sway you toward conservative or liberal beliefs, I decided to paint and decorate my poster in a way that expresses the division into either liberal or conservative. I did this by having the microchips be divided, making one side blue and the other red. In the following weeks I would like to fill my board more and have more of the Wow! component. I feel that my time management is going well so far, but I am having trouble finding a way to phrase my information, and I feel that I need to ask for help with that. I am planning to make a collaborative piece with two other groups. We are planning to represent America as being weakened by forces that could be a threat to it. In my case, we are using how algorithms can threaten America, so we are using a thin stick to represent how America stands on a weakened democracy (the stick).


