Writing for The Conversation, Paul Ibbotson contends that language acquisition in infants emerges from a “general psychological toolkit”, including abilities such as memory, attention and social cognition. In doing so, he argues against linguist Noam Chomsky’s theory that infants are born with an innate, dedicated capacity for language that develops independently of their other cognitive abilities. Chomsky’s theory is known as ‘universal grammar’.
Ibbotson argues that Chomsky’s universal grammar theory “vastly underestimates both the breadth and depth with which cognition interacts with, constrains and predicts language use”. He uses the example of memory to show how language development interacts with other cognitive abilities. As memory develops, the grammatical options available to the infant increase: from “boy sees girl” when memory is very limited, to “boys who chase dogs see girls” as memory capacity grows to accommodate a wider range of language and grammar.
Ibbotson concludes by stating that language acquisition in infants is special because it may be “nature’s finest example of cognitive recycling and reuse”.
Read the full article: Paul Ibbotson, ‘The key to language is universal psychology, not universal grammar’, The Conversation, 21 August 2020.
Algorithms in the Workplace
Writing for the University of Oxford’s Science Blog, Professor Jeremias Adams-Prassl observes the rise and influence of algorithms in different spheres of life, and asks: “what if your boss was an algorithm?” He notes that, while management automation has been used in warehouse and delivery settings for some time, the same technology is increasingly being introduced across the spectrum of workplaces, from “hospitals and law firms to banks and even universities”.
For Adams-Prassl, this may have benefits in certain workplaces. Algorithms can be used to help plan careers and find new opportunities in large organisations. They can also catch insider trading. However, he also urges caution, stating that there are significant risks in entrusting management decisions to machines. These include interviewing software discriminating against applicants based on skin tone, and machines learning to discriminate after being trained on large numbers of CVs from a particular societal group. Adams-Prassl also warns that the legal framework around automated management has not kept pace with technological developments.
The article concludes by warning against “AI for AI’s sake”, and states that AI’s tendency to “punish outliers” means that work needs to be done to improve the protective legal framework around automated management to ensure that minority groups are not discriminated against.
Read the full article: Jeremias Adams-Prassl, ‘Sacked by an algorithm: managing the future’, Oxford Science Blog, 3 September 2020.