Prof. Parker will be heading to UC Davis March 14–17 to give a talk at the 31st Annual CUNY Conference on Human Sentence Processing. His talk is titled “A multi-dimensional view of NPI licensing”, and will present new research showing that real-time Negative Polarity Item (NPI) licensing is driven by semantic and pragmatic conditions, and must involve more than the simple syntactic feature-matching process previously assumed.
Prof. Parker was recently awarded a William & Mary Scholars Undergraduate Research Experience (WMSURE) Fellowship, funded by the Andrew W. Mellon Foundation, to improve interdisciplinary education at the College. As a WMSURE/Mellon Fellow, Prof. Parker will work to increase opportunities for students from first-generation and lower-income families, as well as historically under-represented racial and ethnic groups, to participate in research under faculty supervision.
Congratulations to Prof. Anya Lunden who recently published a series of papers appearing in Phonology, Laboratory Phonology, and The Linguistic Review!
The paper in Phonology was co-authored with current W&M student Jessica Campbell and recent graduate Mark Hutchens, along with Nick Kalivoda (UCSC). The paper in The Linguistic Review was co-authored with graduate Kelsey Renoll.
Lunden, Anya, Jessica Campbell, Mark Hutchens, and Nick Kalivoda. 2017. Vowel-length contrasts and phonetic cues to stress: an investigation of their relation. Phonology 34: 565–580.
Lunden, Anya. 2017. Duration, vowel quality, and the rhythmic pattern of English. Laboratory Phonology 8: 1–20.
Lunden, Anya and Kelsey Renoll. 2017. Position and stress as factors in long distance metathesis. The Linguistic Review 34(4): 615–634.
Prof. Dan Parker published a new article in Journal of Memory and Language. The paper “Processing multiple gap dependencies: Forewarned is forearmed” investigates the processing of sentences with across-the-board (ATB) extraction to better understand the mechanisms of syntactic prediction in sentence comprehension.
Parker, D. 2017. Processing multiple gap dependencies: Forewarned is forearmed. Journal of Memory and Language, 97, 175-186. [pdf]
We accomplished a lot in the Spring 2017 semester. Let’s take stock:
- 2 Honors theses from Quentin Ullrich (chair: Prof. Parker) and Alexa Rosalsky (chair: Prof. Lunden)
- 2 new honors projects approved for the 2017-2018 academic year, from Jessica Campbell and Joshua Greenfield.
- We once again hosted the North American Computational Linguistics Olympiad (NACLO)
- Prof. Lunden gave 2 talks at the 2017 Linguistic Society of America (LSA) Annual Meeting.
- Prof. Lunden published an article in Phonology, co-authored with W&M students Jessica Campbell and Mark Hutchens (along with Nick Kalivoda).
- Prof. Parker published 3 articles. One appeared in Topics in Cognitive Science (co-authored with recent alumnus Daniel Lantz), one in Journal of Memory and Language (co-authored with Colin Phillips), and one in an edited volume on Language Processing and Disorders (co-authored with Mike Shvartsman and Julie Van Dyke).
- Prof. Harrigan successfully connected with area schools to conduct acquisition research with W&M students.
- Prof. Parker gave 3 presentations (2 posters and 1 talk) at CUNY 2017
- A multidisciplinary team of faculty including Prof. Parker established the new Data Science Program at W&M, in which students can pursue a degree starting this Fall
- Prof. Parker obtained a Faculty Summer Research grant to support research in the lab this summer
- And we worked through a series of exciting lab meetings, with presentations from students and faculty!
Whew! Certainly looking forward to Summer!
Prof. Parker will be heading to MIT this March to give three presentations at CUNY 2017:
Schlueter, Z., Parker, D., & Lau, E. (Mis)interpreting agreement attraction: Evidence from a novel dual-task paradigm. Talk at the 30th CUNY Conference on Human Sentence Processing. MIT.
Parker, D. Memory retrieval in sentence comprehension uses a non-linear cue combination rule. Poster at the 30th CUNY Conference on Human Sentence Processing. MIT.
Parker, D. Selective agreement attraction effects: Not all phrases are equally attractive. Poster at the 30th CUNY Conference on Human Sentence Processing. MIT.
Prof. Dan Parker and Colin Phillips (UMD) just published new work in Journal of Memory and Language. Their paper, “Reflexive attraction in comprehension is selective”, shows how to systematically induce attraction effects for reflexive anaphors using eye-tracking. Check it out!
Parker, D. & Phillips, C. (2017). Reflexive attraction in comprehension is selective. Journal of Memory and Language, 94, 272-290. [pdf]
William & Mary once again served as a site host for the North American Computational Linguistics Olympiad (NACLO). The event is an annual competition that engages high schoolers in language and logic problems. This year, around 10 students from local schools participated. Many thanks to the student volunteers, Jessica Campbell and Colin Wilson, for their help promoting the event.
Prof. Dan Parker and recent graduate Daniel Lantz (’16) just published their work “Encoding and Accessing Linguistic Representations in a Dynamically Structured Holographic Memory System” in the journal Topics in Cognitive Science (topiCS). This paper is an extended version of their earlier 2016 paper of the same title, published in the proceedings of the 2016 International Conference on Cognitive Modeling (ICCM).
Parker, D. & Lantz, D. (2017). Encoding and Accessing Linguistic Representations in a Dynamically Structured Holographic Memory System. Topics in Cognitive Science, 9, 51–68. [pdf; supersedes the 2016 ICCM paper of the same title]
Prof. Dan Parker and Colin Phillips (U. of Maryland) just published their work on illusory negative polarity item (NPI) licensing in Cognition! In this paper, Parker and Phillips use standard psycholinguistic methodologies to show that by making minimal changes to a sentence, it is possible to selectively control the presence and absence of linguistic illusions involving NPIs. These findings turn out to be very informative about how linguistic structure is encoded in working memory. Check out the paper: