
Department of Psychological Sciences

Ryan Ringer

 

Ryan Vance Ringer, M.S.

Email: rvringer@k-state.edu

Office: Bluemont Hall 5107

Advisor: Dr. Lester Loschky

Curriculum Vitae

Google Scholar Profile

ResearchGate Profile

I am interested in visual perception and cognition as they relate to our understanding of the real-world environment, with an emphasis on the dynamics of attention in space and time. I am also interested in understanding how the unique contributions of central and peripheral vision allow us to efficiently scan our environment and extract useful information from it. These two interests have come together in a collaborative research effort, funded by the Office of Naval Research, to develop a measure of visual attention across the visual field that (a) is not confounded by eccentricity-dependent low-level perceptual attributes (e.g., acuity, contrast sensitivity) and (b) can be used in driving or flight simulators. I am currently using this novel method to measure the spatiotemporal dynamics of attention during event perception, in order to investigate the relationship between perception, attention, and memory. Although the majority of my experiment programming has been in Experiment Builder (SR Research), I am currently advancing my programming knowledge in MATLAB and Python.

In addition to a wide range of research interests, I also strongly emphasize the importance of rigorous quantitative analysis. Accordingly, I employ a diverse collection of quantitative methods, including general linear, generalized linear, and non-linear mixed-effects modeling, as well as descriptive techniques such as multidimensional scaling and bivariate contour ellipses. I am proficient with SPSS and JMP; however, R is my primary language for data analysis. In addition to experimental design and the philosophy of science, I have made statistical literacy a primary learning objective for my undergraduate research assistants.

EDUCATION

  • Kansas State University, Ph.D. in Psychological Sciences – 2018 (expected)
  • Kansas State University, M.S. in Psychological Sciences – 2016
  • Kansas State University, B.S. in Psychology – 2010 

PEER-REVIEWED PUBLICATIONS

  • Ringer, R.V., Throneburg, Z., Johnson, A.P., Kramer, A.F., & Loschky, L.C. (2016). Impairing the Useful Field of View in natural scenes: Tunnel vision versus general interference. Journal of Vision, 16(2):7, 1-25. doi: 10.1167/16.2.7.

  • Gaspar, J.G., Ward, N., Neider, M.B., Crowell, J., Carbonari, R., Kaczmarski, H., Ringer, R.V., Johnson, A.P., Kramer, A.F., & Loschky, L.C. (2016). Measuring the useful field of view during simulated driving with gaze-contingent displays. Human Factors: The Journal of the Human Factors and Ergonomics Society, 58(4), 630-641.

  • Loschky, L.C., Ringer, R.V., Ellis, K., & Hansen, B.C. (2015).  Comparing rapid scene categorization of aerial and terrestrial views: A new perspective on scene gist. Journal of Vision.

  • Larson, A.M., Freeman, T.E., Ringer, R.V., & Loschky, L.C. (2014). The spatiotemporal dynamics of scene gist recognition. Journal of Experimental Psychology: Human Perception and Performance, 40(2), 471-487.

  • Loschky, L.C., Ringer, R.V., Johnson, A.P., Larson, A.M., Neider, M., & Kramer, A.F. (2014). Blur detection is unaffected by cognitive load. Visual Cognition, 22(3-4), 522-547.

  • Ringer, R.V., Johnson, A.P., Gaspar, J.G., Neider, M.B., Crowell, J., Kramer, A.F., & Loschky, L.C. (2014, March). Creating a new dynamic measure of the useful field of view using gaze-contingent displays. Proceedings of the Symposium on Eye Tracking Research and Applications, 59-66.

CONFERENCE TALKS

  • Ringer, R. V., Johnson, A. P., Gaspar, J. G., Neider, M. B., Crowell, J., Kramer, A. F., & Loschky, L. C. (2014, March). Creating a new dynamic measure of the useful field of view using gaze-contingent displays. Talk presented at the 2014 Symposium on Eye Tracking Research and Applications, Safety Harbor, FL.

RECENT CONFERENCE POSTERS

  • Ringer, R.V. (2017, May). Competition between foveal and peripheral attention reveals evidence in favor of a zoom-lens model of attention. Annual Meeting of the Vision Sciences Society, May 24, 2017.

  • Peterson, J.J., Ringer, R.V., Sisco, E., de la Torre, M., Talkington, H., Shanahan, M., & Loschky, L.C. (2017, May). The effects of unique blur/clarity contrast on visual selective attention as measured by eye movements: Strong clarity capture and weak blur repulsion. Annual Meeting of the Vision Sciences Society, May 22, 2017.

  • Ringer, R.V., Throneburg, Z., Johnson, A.P., Kramer, A.F., & Loschky, L.C. (2016, May). The effects of foveal and auditory loads on covert and overt attention. Annual Meeting of the Vision Sciences Society, May 18, 2016.

  • Ringer, R.V., Throneburg, Z., Walton, T., Erickson, G., Coy, A., DeHart, J., Johnson, A.P., Kramer, A.F., & Loschky, L.C. (2015). A novel approach to measuring the useful field of view in simulated real-world environments using gaze-contingent displays: The GC-UFOV. Annual Meeting of the Vision Sciences Society. Journal of Vision, 15(12), 878.

  • Ringer, R.V., Throneburg, Z.W., Walton, T., Erickson, G., Coy, A., DeHart, J., Johnson, A.P., Kramer, A.F., & Loschky, L.C. (2015, November). General interference is produced by both auditory and visual dual-task loads, but tunnel vision requires a foveal load. Annual Meeting of the Psychonomic Society, November 19, 2015.

  • Loschky, L.C., Ringer, R.V., Throneburg, Z., Kramer, A.F., & Johnson, A.P. (2015, August). The useful field of view in real-world scene viewing: Tunnel vision versus general interference. The 18th European Conference on Eye Movements, Vienna, Austria.

HONORS/AWARDS

  • Paper “Blur Detection is Unaffected by Cognitive Load” was included in a special online article collection of most downloaded papers in Routledge Behavioral Sciences journals in 2014 (February, 2015).

  • Award for Best Long Paper and Best Student Paper for “Creating a New Dynamic Measure of the Useful Field of View Using Gaze-Contingent Displays.” Prize included a free EyeTribe eye tracker (March, 2014).

  • Article featured in the American Psychological Association’s PEEPS (Particularly Exciting Experiments in Psychology) for “The Spatiotemporal Dynamics of Scene Gist Recognition” (February 2014, Issue 15).

  • Awarded a $1000 undergraduate research grant by the Kansas Space Grant Consortium for research in human perception of satellite imagery (Spring, 2010).