

Mentalistic Understanding of Transformations

Utku Turk

08 Feb 2022


[[Classnotes MOC]] [[Psycholing Classnotes]] [[Learning Wh-Gaps, Utku Turk]]

Date:: 2022-02-08

Class:: Psycholinguistics II

Tags:: #umd #classnotes #umd/psylx

Miller (1962): Transformational Cube.

What are the mentalistic claims about transformations?

Let’s start with effects at the verb position. English fronts wh-words. How is the wh-word linked to the verb?

What if we try to test the steps of derivations? It is difficult to use grammatical rules directly for this.

Basic Parsing Systems

\(\cdot \) represents the point at which the rule (here, the NP built from Det N) is formed.

Bottom up = Det N \(\cdot \)

Left corner = Det \(\cdot \) N

Top down = \(\cdot \) Det N
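
A minimal sketch (my own illustration, not from the lecture) of what the dot notation amounts to: each strategy announces, i.e., forms, the NP rule at a different point relative to reading Det and N. The function name and event labels are assumptions.

```python
# Where each parsing strategy "announces" the rule NP -> Det N relative to
# consuming its daughters; 'form NP' is the position of the dot above.
RULE = ("NP", ["Det", "N"])

def announce_point(strategy, rule):
    parent, daughters = rule
    reads = [f"read {d}" for d in daughters]
    if strategy == "top down":      # . Det N : rule predicted before any input
        return [f"form {parent}"] + reads
    if strategy == "left corner":   # Det . N : rule projected after the first daughter
        return reads[:1] + [f"form {parent}"] + reads[1:]
    if strategy == "bottom up":     # Det N . : rule built only after all daughters
        return reads + [f"form {parent}"]
    raise ValueError(strategy)

for s in ("bottom up", "left corner", "top down"):
    print(f"{s:11} : " + " -> ".join(announce_point(s, RULE)))
```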

Arnold read some books

Bottom up: after reading NP(Arnold) V(read), there is nothing to do.

NP(Arnold) V(read) Det(some), still nothing.

NP(Arnold) V(read) Det(some) N(books): now we form NP(Det N) > VP(V NP) > S(NP VP)

Strict bottom-up parsing does not explain incrementality.
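
A hedged sketch of a strict shift-reduce (bottom-up) parse of the example sentence. The toy grammar and the greedy reduction policy are assumptions chosen to match the walkthrough above, not the lecture's exact formalism; the trace shows that nothing above the word level is built until the final word arrives.

```python
# Strict bottom-up (shift-reduce) parse of "Arnold read some books".
LEXICON = {"Arnold": "NP", "read": "V", "some": "Det", "books": "N"}
RULES = {("Det", "N"): "NP", ("V", "NP"): "VP", ("NP", "VP"): "S"}

def shift_reduce(words):
    stack = []
    for w in words:
        stack.append(LEXICON[w])                     # shift the word's category
        print(f"shift {w!r:9} stack = {stack}")
        while len(stack) >= 2 and tuple(stack[-2:]) in RULES:
            stack[-2:] = [RULES[tuple(stack[-2:])]]  # reduce whenever a rule matches
            print(f"reduce          stack = {stack}")
    return stack

shift_reduce("Arnold read some books".split())
# No reduction fires until "books" arrives; then NP, VP, and S are built
# all at once, which is the non-incrementality point above.
```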

Left Corner

NP(Arnold): we go up, see S, and form S(NP \(\_\))

Then we see read; we can form S(NP VP(V \(\_\)))

Then we see some; we can form S(NP VP(V NP(Det \(\_\))))

Finally we see books; now we have the full structure: S(NP VP(V NP(Det N)))

The left-corner parser allows us to hypothesize orphan structures; it also allows us to entertain multiple structures. Bottom-up parsing by definition does not do this.
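
A hedged sketch of an arc-eager left-corner recognizer for the same toy grammar; the rule table and the attach function are my assumptions, not the lecture's implementation. The goals left after each word correspond to the \(\_\) slots in the partial structures above, which is the sense in which the parser entertains orphan structures incrementally.

```python
# Arc-eager left-corner recognition of "Arnold read some books".
LEXICON = {"Arnold": "NP", "read": "V", "some": "Det", "books": "N"}
# Maps a left-corner daughter to (parent, remaining daughters).
RULES = {"NP": ("S", ["VP"]),    # S  -> NP VP
         "V": ("VP", ["NP"]),    # VP -> V  NP
         "Det": ("NP", ["N"])}   # NP -> Det N

def attach(cat, goals):
    """Attach a constituent of category `cat` to the pending goals,
    projecting parent nodes from their left corner as needed."""
    if goals and goals[0] == cat:
        return goals[1:]                   # fills the predicted slot directly
    parent, remaining = RULES[cat]         # hypothesize an "orphan" parent node
    # Arc-eager: the parent attaches to the outer goals right away, and its
    # missing daughters become the next categories we expect to hear.
    return remaining + attach(parent, goals)

goals = ["S"]
for word in "Arnold read some books".split():
    goals = attach(LEXICON[word], goals)
    print(f"after {word!r:8} still expecting: {goals}")
# after 'Arnold' -> ['VP'], after 'read' -> ['NP'],
# after 'some'  -> ['N'],  after 'books' -> [] (parse complete)
```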

Top down

If we take transformations literally, it is going to be really hard to use them as an identification device.

a = \(X_1\) wh-NP \(X_2\)

b = wh-NP \(X_1\) 0 \(X_2\)

It is easy to derive b from a, but really hard to derive a from b.
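
A hedged sketch of why taking the transformation literally as an identification device is hard: running it forwards (a to b) is deterministic, but inverting it (recovering a from b) means guessing where the unpronounced gap sits among many candidate positions. The sentence and helper names are illustrative assumptions.

```python
# Forward wh-fronting is a function; undoing it is one-to-many.
def front_wh(words):
    """a -> b: move the (unique) wh-word to the front, leaving a gap.
    Deterministic: exactly one output."""
    i = next(i for i, w in enumerate(words) if w.startswith("wh-"))
    return [words[i]] + words[:i] + ["_"] + words[i + 1:]

def undo_fronting(words):
    """b -> a: put the fronted wh-word back. The gap is not pronounced,
    so the comprehender must consider every position it could occupy."""
    wh, rest = words[0], [w for w in words[1:] if w != "_"]
    return [rest[:i] + [wh] + rest[i:] for i in range(len(rest) + 1)]

a = ["you", "think", "Sally", "ate", "wh-what"]
b = front_wh(a)
print(b)                      # ['wh-what', 'you', 'think', 'Sally', 'ate', '_']
print(len(undo_fronting(b)))  # 5 candidate base positions to check
```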

Surface Structures

People heard these sentences with a click placed in the middle of the word company. People misreport where they heard the click. In the first sentence, the click migrates back to the sentence boundary; in the second, it migrates forward, to the end of the word company.

People had illusions about where they heard the click, and these illusions align with hidden structure.

Deep Structures

Cued recall (Wanner, 1968)

Cue: detective. More effective for cease than for prevent.

Sachs (1967, 1974): memory for content, not structure. Structures persist, as does meaning, but the two are independent of each other.

Bransford & Franks (1971)

Highest recognition confidence for the full sentence, which was never actually seen before!

Bransford, Barclay, & Franks (1972)

Takeaway

Surface structure yep

deep? maybe…

transformations??? NOPE!

The idea that “backward” building (from b to a) is not very plausible was itself the more sensible conclusion.

Cross modal priming

Could we see evidence for this mediated encoding, that is, for transformations?

They decide on “boy” faster in positions after from or meet than they decide on “girl”.

Pickering & Barry 1991

Timing evidence can arbitrate between theories only when the theories actually make timing predictions.

Relating structure and time in linguistics and psycholinguistics, Phillips & Wagers (2007)

Not so obvious variation

While John was reading the book, he ate an apple

While he was reading the book, John ate an apple. Not OK in Russian, OK in English.

John ate an apple while he was reading the book.

He ate an apple while John was reading the book. Impossible in all languages.

Example 2: constraint on questions

What do you think Sally ate?

What do you think that Sally ate?

Who do you think ate the donut?

Who do you think that ate the donut? Not OK in English, OK in Italian.

Variation might be linked to the possibility of post-verbal subjects.

It is not about corpus frequency, because the second question (with that) is also rarely heard, yet it is fine.

Languages that allow post-verbal subjects also allow the last example (who do you think that ate the donut).

Post-verbal subjects are easy for learners to observe.