
Algorithms and the Various Types of Opacity



Vaibhav Bhatia

The article How the machine ‘thinks’: Understanding opacity in machine learning algorithms aims to provide insight into how algorithms work and the various types of opacity that come with them. The article also explores how these opacities play a role in our everyday lives, in applications ranging from something as simple as spam filtering to neural networks and image recognition. Machine learning algorithms are the main focus of Burrell’s article. An algorithm can be extremely useful because of the amount of work it can take on, and hence the human effort it saves: one thing an algorithm can do is classify emails as spam or non-spam, and credit-scoring algorithms are another example. The problem of opacity arises when we do not know what is going on inside the black box.
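For a rough sense of what such a spam classifier actually does with a message, the sketch below trains a bag-of-words model on a handful of toy emails and then classifies a new one. The example data and the use of scikit-learn are my own assumptions for illustration and are not taken from Burrell's article.

# A minimal sketch of the kind of spam/non-spam classifier Burrell discusses.
# The toy emails and the choice of scikit-learn are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

emails = [
    "Claim your free prize now",          # spam
    "Meeting moved to 3pm tomorrow",      # not spam
    "You have won a lottery, send fees",  # spam
    "Lunch on Friday?",                   # not spam
]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = not spam

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(emails)   # turn each email into word counts
model = MultinomialNB().fit(X, labels)

# Classify a new message; the learned word statistics are the "black box".
print(model.predict(vectorizer.transform(["Send fees to claim your prize"])))

The learned word statistics, not any explicit rule written by a person, are what decide the outcome, and that is exactly where the black box begins.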


Three Types of Opacity

  1. Intentional Opacity: Companies and organizations make conscious efforts to protect their algorithms, essentially to keep them from being copied by a competitor or breached by a hacker. This type of opacity might seem necessary at times, and it intentionally prevents end users from figuring out the internal workings of the algorithm.
  2. Illiterate Opacity: This form of opacity stems from an absence of technical know-how and often arises when an organization relies on third-party products and services. Training people in-house and learning the core workings of the algorithm are the only ways to resolve this kind of opacity.
  3. Intrinsic Opacity: This is caused by a mismatch between the way the human mind and the algorithm perceive data. The complexity of the model is a key factor in determining the level of intrinsic opacity an algorithm might contain. The training process must be clear in the sense that the code is clear and legible and the data set is well understood by the people working on it. I personally believe it is the toughest type of opacity to get rid of, for the reason sketched just after this list.
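To make that last point concrete, the short sketch below trains a simple text classifier and prints the weight it learns for every word. The toy emails and the choice of a logistic-regression model from scikit-learn are my own illustrative assumptions, not Burrell's example; the point is only that even fully legible code and data produce a decision rule that is a sum of learned numbers rather than anything a human would recognize as reasoning, and real filters do this over thousands of interacting features.

# A sketch of intrinsic opacity: the code and data are perfectly legible,
# yet the model's "reasoning" is just a vector of learned numbers.
# The toy emails and the logistic-regression model are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

emails = ["free money now", "project status update",
          "win cash prizes now", "team meeting notes"]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = not spam

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(emails)          # bag-of-words counts
model = LogisticRegression().fit(X, labels)

# Each word gets a numeric weight; with thousands of features interacting,
# no single weight "explains" a decision the way a human-readable rule would.
for word, weight in zip(vectorizer.get_feature_names_out(), model.coef_[0]):
    print(f"{word:12s} {weight:+.3f}")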

In the example of spam filtering in Nigeria, the set of words selected is region specific and caters to a specific demographic. The algorithm depends greatly on the data it is trained on, and the ambiguity in how that data is chosen and how the model is trained creates a level of mistrust among the people affected by the algorithm. While using a service, a customer has no control over the algorithm that his or her data is subjected to. As data literacy increases, more and more end users will want greater control over the nature of the algorithms they are subjected to. From a legal point of view, fair usage of data can only be attained through a more transparent system. That would put corporations under greater scrutiny so that they do not exploit consumer data while still leveraging it to provide better services.
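As a small sketch of that dependence on training data, the two filters below share the same pipeline but are trained on different word lists, and they can reach opposite verdicts on the same message. The word lists and labels are invented for illustration and are not drawn from the article.

# The same pipeline, fed region- or context-specific examples, learns a
# different notion of "spam". Training data, not code, drives the behavior.
# All example messages here are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

def train_filter(emails, labels):
    return make_pipeline(CountVectorizer(), MultinomialNB()).fit(emails, labels)

filter_a = train_filter(
    ["urgent wire transfer needed", "quarterly report attached"], [1, 0])
filter_b = train_filter(
    ["quarterly report attached scam", "family dinner this weekend"], [1, 0])

message = ["quarterly report attached"]
print(filter_a.predict(message), filter_b.predict(message))  # likely [0] vs [1]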

...
