Archive for the ‘Cognitive Science’ Category

Distinguishing Imagery and Perception

Wednesday, December 17th, 2008


Imagine these bananas

I’m having a hard time remembering a reference for an experiment I recall hearing about. Any help is appreciated.

As I recall the experiment, subjects were asked to look at a visual stimulus on a screen which was presented in variable degrees of faded-ness (with completely faded stimuli disappearing altogether). Subjects were also asked to imagine (form mental images of) the vanished stimuli as still being present on the screen. At some point in the experiment, there was some sort of measure of whether the subjects were seeing an actually present stimulus versus imagining one. I recall there being some result concerning the subjects not being terrific at distinguishing their own imagery from perceptions of the real deal.

Anyone know what I may be remembering? Or am I imagining this?

Cognitive Science Visiting Undergraduate Program

Wednesday, February 20th, 2008

The Cognitive Science Program at Indiana University, Bloomington invites upper-level undergraduate students and students who are graduating from college to apply to the Cognitive Science Visiting Undergraduate Program.
The program is designed to give students interested in Cognitive Science an opportunity to design and conduct their own research while working closely with a faculty mentor, at the top Cognitive Science Program in the country, for a full academic year.

Students selected for the program may enroll in up to 17 credits per semester, but will be expected to devote a minimum of 6 credit hours per semester to research. Students will also have the option to enroll in our outstanding undergraduate courses. The Cognitive Science Undergraduate Program stresses skills acquisition, and aims to foster the abilities that make students into scientists.

The program can provide the following important opportunities and experiences:
- Improve your chances of being accepted to a top graduate program
- Build your CV with invaluable lab research experience not available at your home institution
- Design your own research projects
- Work closely with a faculty mentor
- Participate in symposia and colloquia with IU’s distinguished and highly accomplished Cognitive Science faculty
- Learn how to prepare and submit research for publication

Students applying to the Visiting Undergraduate Program must meet the following requirements to be considered for admission:
- Junior or senior class standing (in exceptional cases, fellowships may be awarded to students with sophomore standing, but such applications are not encouraged).
- A minimum GPA of 3.3 on a 4.0 scale.
- A background in computer science, mathematics, neuroscience, philosophy, or psychology, or some combination thereof.

Students who are accepted to the program will receive an out-of-state tuition waiver. Students will be responsible for the cost of in-state tuition and fees (approximately $8,000 for the year) and the cost of room and board.
To apply, students must submit an application form and materials checklist, which can be printed from the following web site:
http://www.cogs.indiana.edu/academic/visit.html, or downloaded by right-clicking the link and choosing the appropriate option in your browser. In addition, students must submit a 1-2 page personal statement describing the research they would like to pursue (identifying, if possible, the IU faculty member(s) with whom they would like to do this research), a CV, an official transcript, SAT or GRE scores, and three letters of recommendation. IU Cognitive Science affiliated faculty may be found on the Cognitive Science Program home page at http://www.cogs.indiana.edu/index.asp.

Students who are invited to participate will receive an application for admission to Indiana University. The application must be completed and returned to the Office of Admissions. Visiting Undergraduate Research Fellows must be accepted to Indiana University in order to participate in the program. Students accepted to the program will be classified as transfer students for the year that they are in residence at IU.

The above information should be submitted to:
Cognitive Science Program
Eigenmann, Room 817
Indiana University
1910 E. 10th St.
Bloomington, IN 47406-7512
cogsadv@indiana.edu

The application deadline is March 14, 2008. Those who are accepted will be notified by mid-April.

Reading Two Posts

Wednesday, January 16th, 2008

The following two items recently culled from the blogosphere merit simultaneous reading by the optically dexterous (see accompanying figure):

Item 1. “What Kind of Philosophy Gets in the News?” @Leiter Reports (w/ guest poster Jason Stanley).
Excerpt:

The popular press will not be producing articles on Field, Fine, Raz, or Stalnaker’s recent work, despite the fact that these philosophers produce work that is among the most admired by other philosophers.

Item 2. “Opinion Leaders Impotent in Ideas Economy” @Mind Hacks
Excerpt:

[L]arge numbers of people would embrace a particular idea when a certain number of their more easily influenced peers started to champion it.

William Wegman (American, b. 1943). Reading Two Books, 1971. Gelatin silver print.

Inducing Out-of-Body Experiences

Tuesday, October 9th, 2007

Philosopher Thomas Metzinger emailed me a bunch of cool stuff he’s doing with some neuroscientists to utilize virtual reality to induce out-of-body experiences.

From “The embodied self: Using virtual reality to study the foundations of bodily self-consciousness”:

The “I” one thinks of as “myself” is inextricably attached to one’s bodily location. In patients with certain neurological conditions this sense of spatial unity can break down, causing disturbing sensations such as out-of-body experiences in which the global self is localized outside one’s body limits (often called disembodiment).

Previous experiments have shown that people may attribute fake body parts to their own bodies. In the “Rubber Hand Illusion”, a person’s unseen hand is stroked synchronously with a visible fake hand, and then the person is asked to point to his own hand. Subjects invariably err in the direction of the fake hand, attributing it to their own bodies. Because the attribution does not involve the whole body, the sense of global bodily self-consciousness is not affected. EPFL Professor Olaf Blanke, graduate students Bigna Lenggenhager and Tej Tadi, and philosopher Thomas Metzinger hypothesized that the same approach could be used to study the concept of global bodily self-consciousness by using a single, coherent body representation instead of just a body part.

See also:
New Scientist, “Out-of-body experiences are ‘all in the mind’”
New York Times, “Studies Report Inducing Out-of-Body Experience”.

What’s so metaphorical about the computer metaphor?

Thursday, June 7th, 2007

It’s common for people to speak of the computer metaphor and a recent post over at the Brains blog inspired the following thought.

There is a sense of the verb “compute” whereby many, if not all, people compute insofar as they calculate or figure stuff out. Insofar as they literally compute, they literally are computers. Further, the use of “compute”, “computing”, and “computer” as applied to non-human machines is derivative of the use as applied to humans.

It strikes me as a bit odd, then, to say that calling people or their minds “computational” is something metaphorical.

Stochastic Resonance in Vision

Saturday, May 12th, 2007

Stochastic resonance described in Mind Hacks:

…adding noise to a signal raises the maximum possible combined signal level. Counterintuitively, this means that adding the right amount of noise to a weak signal can raise it above the threshold for detection and make it easier to detect and not less so.

Link to demo: [Link]

[Figures: the demo image shown without noise, with less noise, with noise, and with more noise]
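The effect is easy to reproduce numerically. Here is a minimal sketch (the threshold, signal amplitude, and noise levels are all illustrative, not taken from the demo): a sub-threshold sine wave never trips a hard-threshold detector on its own, a moderate dose of Gaussian noise makes the threshold crossings track the signal, and too much noise washes the signal back out.

```python
import math
import random

def detect(signal, noise_sd, threshold=1.0, seed=0):
    """Hard-threshold detector: emits 1 whenever signal + noise crosses threshold."""
    rng = random.Random(seed)
    return [1 if s + rng.gauss(0.0, noise_sd) > threshold else 0 for s in signal]

def correlation(xs, ys):
    """Pearson correlation; returns 0.0 when either sequence is constant."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

# A weak periodic signal whose peak (0.5) never reaches the threshold (1.0),
# so with zero noise the detector stays silent and the correlation is 0.
signal = [0.5 * math.sin(2 * math.pi * t / 50) for t in range(5000)]

for sd in (0.0, 0.4, 3.0):
    print(sd, round(correlation(signal, detect(signal, sd)), 3))
```

With these particular numbers, the moderate noise level produces the strongest correlation between the hidden signal and the detector's output, which is the counterintuitive point of the quoted passage.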

Sight without Light

Sunday, May 6th, 2007



Contact

Originally uploaded by Pete Mandik.

Eric Schwitzgebel’s got a cool question: “With Your Eyes Closed, Can You See Your Hand in Front of Your Face?”

At one point, Eric describes something apparently oft discussed amongst spelunkers:

But on the other hand, most people, deep in a cave where there isn’t a single photon to pierce the darkness, will report being able to see their hands moving in front of their faces. That this isn’t a matter of picking up on visual stimulus is made clearer by our inability in such situations to detect another person’s hand waved before our faces. It seems that our knowledge of the movement of our hand is somehow affecting our visual experience, or at least our judgments about our visual experience, without actually causing any visual input.

Such a phenomenon is a pretty extreme case of what I call “Underdetermined Perception” of the “Active Perception” variety in my paper “Action-Oriented Representation”.

The perception of illusory contours is just one kind of underdetermined perception. The focus of this chapter is another kind of underdetermined perception: what I shall call “active perception”. Active perception occurs in cases in which the percept, while underdetermined by sensation, is determined by a combination of sensation and action.

An …example of …active perception is reported by Lenay et al. (1997) and Hanneton et al. (1999). Subjects use a tactile based device to identify simple 2-dimensional forms such as broken lines and curves. The subjects wear a single tactile stimulator on a fingertip. The stimulator is driven by a magnetic pen used in conjunction with a graphic tablet. A virtual image in black and white pixels is displayed on a screen that only the experimenter is allowed to see. The subject scans the pen across the tablet and thus controls a cursor that moves across the virtual image. A stimulus is delivered to the fingertip only when the cursor is on pixels that make up the figure and not on background pixels. Subjects with control over the pen are able to identify the images. Subjects that merely passively receive the tactile information cannot.
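The crucial sensorimotor contingency in that setup can be sketched in a few lines. Everything here (the grid, the figure shape, the names) is my own illustration, not from Lenay et al. or Hanneton et al.; the one thing the sketch captures is that the fingertip stimulator fires only while the subject-controlled cursor sits on a figure pixel.

```python
# A short horizontal line segment as the hidden figure (illustrative).
FIGURE = {(x, 2) for x in range(1, 6)}

def tactile_feedback(cursor):
    """True iff the stimulator should fire at this cursor position."""
    return cursor in FIGURE

# A scan path the subject might trace with the pen; taps mark the figure.
path = [(0, 2), (1, 2), (2, 2), (3, 0), (4, 2), (5, 2), (6, 2)]
print([tactile_feedback(p) for p in path])
# -> [False, True, True, False, True, True, False]
```

What the sketch cannot capture is the experiment's main finding: the same stream of taps supports figure identification only when the subject generates the scan path, which is exactly why the percept counts as determined by sensation plus action.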

References
Hanneton S., Gapenne O., Genouel C., Lenay C., Marque C. (1999). “Dynamics of shape recognition through a minimal visuo-tactile sensory substitution interface.” Third International Conference on Cognitive and Neural Systems, pp. 26-29.

Lenay C., Cannu S., Villon P. (1997). “Technology and perception: the contribution of sensory substitution systems.” In Second International Conference on Cognitive Technology, Aizu, Japan. Los Alamitos: IEEE, pp. 44-53.


Mental Representations in Non-Human Animals

Wednesday, March 7th, 2007



My Goat, It’s Full of Stars

Originally uploaded by Pete Mandik.

It is worth noting that the power of representational explanation is not simply some story we tell ourselves and each other sustained by our own (possibly mistaken) views of ourselves. One way to appreciate the power of such explanations is to appreciate them in the context of explaining the behaviors of non-human animals. The literature is filled with such examples. Here are just a few.

Consider the impressive feats of maze learning exhibited by rats. A Morris water maze is filled with water rendered opaque to obscure a platform that will offer a rat a chance to rest without having to tread water. When placed in the maze for a first time, a rat will explore the area and eventually find the platform. When the rat is returned to the starting position, the rat does not repeat the exploratory strategy but instead swims straight to the remembered location of the platform. Apparently, the perceptual inputs gained during the exploration were utilized to compute the straight-line path to the platform. The rat’s behavior is thus explicable in terms of psychological states such as perceptions and memories and computations that operate over them.

Gallistel (1990, The Organization of Learning) describes another such example:

Every day two naturalists go out to a pond where some ducks are overwintering and station themselves about 30 yards apart. Each carries a sack of bread chunks. Each day a randomly chosen one of the naturalists throws a chunk every 5 seconds; the other throws every 10 seconds. After a few days experience with this drill, the ducks divide themselves in proportion to the throwing rates; within 1 minute after the onset of throwing, there are twice as many ducks in front of the naturalist that throws at twice the rate of the other. One day, however, the slower thrower throws chunks twice as big. At first the ducks distribute themselves two to one in favor of the faster thrower, but within 5 minutes they are divided fifty-fifty between the two “foraging patches.” … Ducks and other foraging animals can represent rates of return, the number of items per unit time multiplied by the average size of an item.
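Gallistel's arithmetic is easy to check (the unit chunk size is my stipulation; "matching" here just means the ducks split in proportion to each feeder's rate of return):

```python
def rate_of_return(items_per_sec, item_size):
    # the number of items per unit time multiplied by the average item size
    return items_per_sec * item_size

fast = rate_of_return(1 / 5, 1.0)       # one unit-size chunk every 5 seconds
slow = rate_of_return(1 / 10, 1.0)      # one unit-size chunk every 10 seconds
share_fast = fast / (fast + slow)
print(share_fast)                       # about 2/3 of the ducks

slow_big = rate_of_return(1 / 10, 2.0)  # slower thrower, double-size chunks
share_even = fast / (fast + slow_big)
print(share_even)                       # 0.5: the fifty-fifty split
```

So the two-to-one split and the subsequent fifty-fifty split both fall out of the same rate-of-return quantity, which is the point of crediting the ducks with representing it.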

In both the cases of the rats and the ducks, the ultimate explanation called for is going to require mention of some relatively subtle mechanisms inside of the animals that are sensitive to properties of the environment. To get a feel for what might be called for, contrast the way in which we would explain, on the one hand, the movements of the rat toward the platform or the duck toward the bread and, on the other hand, a rock falling toward the earth. The rock’s movement is explained by a direct appeal to a fundamental force of nature that constitutes the attraction between the respective masses of the earth and the rock. Such a direct appeal to a fundamental force will not explain the rat’s movement to the platform. This is not to say, of course, that something non-physical is transpiring between the rat and the platform. There is of course energy flowing between the two that impacts the rat in ways that ultimately explain its behavior. But unlike the case of the rock, the transference of energy from platform to rat will only have an impact on the rat’s behavior insofar as the rat is able to transduce the information carried by that energy into a code that can be utilized by information processing mechanisms in its central nervous system. Such mechanisms will be able to store information in the form of encoded memories and make comparisons between encoded memories and current sensory input to compute a course of action toward a goal state.

(Adapted from Mandik, Collins, and Vereschagin (in press). “Evolving Artificial Minds and Brains.” In Mental States. Vol. 1: Evolution, Function, Nature, eds. Andrea C. Schalley and Drew Khlentzos. John Benjamins Publishing Company.)

Shit Happens

Friday, January 5th, 2007


Life During War Time

Originally uploaded by Pete Mandik.

I just got the thumbs-up from David Coady on my proposed contribution to a special issue of Episteme he’s editing on the epistemology of conspiracy theories. The title, “Shit Happens,” comes from, among other places, an epigraph in Brian Keeley’s J. Phil. article “Of Conspiracy Theories.” My article should be of interest to Brain Hammer readers interested in the function and evolution of folk-psychology.

Here’s the abstract:

In this paper I embrace what Brian Keeley calls, in “Of Conspiracy Theories,” the absurdist horn of the dilemma for philosophers who criticize such “theories”. I thus defend the view that there is indeed something deeply epistemically wrong with conspiracy theorizing: conspiracy theories over-extend intentional explanation and attribute reason where reason does not apply. Along the way I explore some of the cognitive bases for the kind of totalizing intentional explanation of which conspiracy theories are but one instance (much religious thinking constitutes further instances). I speculate as to the evolutionary basis of such explanations. The evolutionary origins of intentional explanation, and thus the niche for which it was adapted, concern tracking relatively small numbers of agents in relatively small social dominance hierarchies. But attempts to apply reason-based explanations to numbers of agents approaching global scales over large chunks of history are as inappropriate as applying them to inanimate objects. Nonetheless, the urge to do so, the urge to theorize conspiratorially, is itself in need of an explanation, and I explore what cognitive or psychological factors might underlie this urge.


A Book, a Chapter, and Two Bald Heads

Thursday, January 4th, 2007

A book I co-authored, Cognitive Science: An Introduction to the Mind and Brain, is out now. [Link to Routledge's page for the book.] If you have a mind and/or a brain, but have not yet been introduced to them, this may be the book for you.

Also out now is The Blackwell Companion to Consciousness. With a title like that, you can probably guess it was published by Blackwell. [Link to guess-who's page for the book.] If you are conscious and in need of a companion, this may be the book for you. This book contains many excellent chapters by many excellent people. It also contains a chapter by me: “The Neurophilosophy of Consciousness.” [Link to draft of my chapter.] Here’s the abstract:

The neurophilosophy of consciousness brings neuroscience to bear on philosophical issues concerning phenomenal consciousness, especially issues concerning what makes mental states conscious, what it is that we are conscious of, and the nature of the phenomenal character of conscious states. Here attention is given largely to phenomenal consciousness as it arises in vision. The relevant neuroscience concerns not only neurophysiological and neuroanatomical data, but also computational models of neural networks. The neurophilosophical theories that bring such data to bear on the core philosophical issues of phenomenal consciousness construe consciousness largely in terms of representations in neural networks associated with certain processes of attention and memory.

Regarding the books, both volumes bear handsome cover illustrations of bald heads. Until someone figures out a better way to draw a picture of the mind, we are going to be stuck with bald heads.