How can museums change teenagers, and vice versa?


A post by Chelsea Kelly of Art Museum Teaching sharing insights on engaging teens at the museum... and vice versa! Translated by Ilaria.

This is my translation of an article by Chelsea Kelly, published on the blog Art Museum Teaching, about the change that student programs within a museum can bring about in teenagers. Here is the link to the original. Happy reading!

Over the past four years, I have worked with hundreds of teenagers from Milwaukee and the surrounding area who love art, and whose love for museums has grown through teen programs at the Milwaukee Art Museum.



I always felt that my students grew during the time they spent at the Museum. This year, to analyze that growth in earnest, we turned our long-established Satellite Program for High Schools into a year-long experience aimed at discovering exactly how weekly encounters in an art museum can change the thinking of our participating teens. The goal of the program was for the teens to show a growing ability to reflect on their own experiences and achievements.

This led me to think long and hard about assessment: how do we demonstrate that change has happened? Years ago, I thought of assessment as a thorny but clear-cut and necessary topic that required a lot of math; in the last couple of years, I have come to believe it is exactly the opposite (although the math still matters!). Assessment is a gray area, much like teaching and interpretation, and we as educators need to use multiple methods to get a more complete picture of our students. Moreover, these methods can themselves become tools that help us teach, improve our programs, and deepen our impact on students.

Eventually, I realized that I too needed to reflect in order to understand how my students were changing, and to experiment with different ways of describing their growth. In this article, I will share some of the methods I used this year in the High School Satellite Program to find out how our teenagers changed through reflection.

First of all: what is the Satellite Program?
The Satellite Program for High Schools is a year-long internship for sixteen teenagers, ages 16 to 18, from different high schools around Milwaukee. Once a week after school, they come to the Museum and explore how art can be relevant to their lives today. They participate in hands-on studios (hour-long discussions of a single work of art), attend workshops on writing a résumé, talk with staff about what their professions look like behind the scenes, and mentor elementary school students visiting the permanent collection. The teens also create a final project with a real impact on the Museum: they choose a work of art from the Collection, research it, and formulate their own interpretation of the work. In past years, students have created their projects using visual art, writing, or performance.

This year, they used iPads to make videos about their chosen artworks, offering their own interpretations and explaining how the work had changed their thinking or art practice.

Families at the Satellite Program in Milwaukee

Assessment
Let’s start with the basic evaluation method we used for this program. We were fortunate to work with the program’s founding partner, the Milwaukee Public Schools Partnership for the Arts & Humanities, and with the Center for Urban Initiatives and Research (CUIR) at the University of Wisconsin-Milwaukee, to develop the project described above and establish a tool for evaluating it. We agreed on individual interviews: one in the program’s first days, in October, and one in its last days, in May. Each student was asked the same set of questions, separately, in both interviews, prompting them to reflect on their experiences during the program. Using a rubric, I scored each interview on the level of detail in the answers, then compared the two scores to see whether the student had improved.

By the end, every student had improved their thinking skills: their answers were much more detailed. Scoring the interviews with the rubric was rewarding because it turned them into quantitative, more tangible data. Yet, for all its usefulness, this is still only one method of evaluation. Being able to explain in detail is certainly a key part of thinking well; but as I listened to the responses and thought about what I had seen in the students over the year, I realized there were many other aspects to consider besides detail. The students were using stronger vocabulary in their answers, expressing more sophisticated ideas, and asking more questions, and deeper ones. How could I capture that kind of change?

Unexpected data
Fortunately, along the way we came across unexpected data that helped me see the change in my students more concretely.

Surveys
At the end of each meeting, students used an iPad app called Infuse Learning to answer a quick exit survey, a simple way to take the pulse of a group. For this program, they answered questions such as "What did you learn today?" and "What are you still reflecting on?" Although different from our interview questions, these certainly support reflection, since they prompt students to look back on the day just finished.

Then, over the year, I noticed that the teens’ answers were becoming more sophisticated: they were longer, contained more art-related vocabulary, and acknowledged that certain questions might not have a definitive answer at all. On the advice of Marianna Adams, a specialist in museum research and evaluation, I ran two different readability tests on these answers to see whether they could quantify that sophistication. One scored the answers along the lines of the Gunning Fog Scale, which is based on syllable counts and sentence length (a score around 5 means very readable, around 20 very difficult). The other scored them by Flesch-Kincaid Grade Level, that is, the school grade level needed to read and understand the text.
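For readers curious how such tests work mechanically, here is a minimal sketch of the two formulas in Python. This is not the tool used in the program; the naive vowel-group syllable counter is an assumption for illustration, and real readability tools use more careful syllable estimation.

```python
import re

def _syllables(word):
    # Naive syllable estimate: count groups of consecutive vowels.
    # This is an approximation, not a linguistic gold standard.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def gunning_fog(text):
    # Gunning Fog index: 0.4 * (words per sentence + 100 * complex/words),
    # where "complex" words have three or more estimated syllables.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words if _syllables(w) >= 3]
    return 0.4 * (len(words) / len(sentences)
                  + 100 * len(complex_words) / len(words))

def flesch_kincaid_grade(text):
    # Flesch-Kincaid grade level:
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)
```

Both scores rise with longer sentences and longer words, which is exactly why, as described below, they reward answers that are wordier rather than answers that are more thoughtful.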

A teen speaks at the museum

For the first question (What did you learn today?), scores rose considerably on both the Fog Scale and the grade level; since these tests measure syllable counts, sentence length, and reading grade, they are useful for basic assessment.

But I was surprised to find that when I ran the tests on answers to the second question (What are you still thinking about?), the students’ scores actually went down! And yet, reading their answers, there is a drastic change for the better.

Take Student D. In his first answer, he asks a fairly basic art-history question: how to distinguish one kind of art from another. In his last answer, he is thinking deeply about the purpose of art and how we decide what counts as art. And although Student F uses high-level art-history vocabulary in his first response, it is without context; later, he reflects on how two seemingly opposite concepts may have something in common after all.

The scores for these responses may have gone down, but I would argue that the students’ reflective ability went up: the teens ask great questions that may have no answers; they trade high-level vocabulary for more informal language about philosophical issues of art, destruction, and race. Running these answers through the tests reminded me once again that tools, however useful, are in the end only tools. We need more of them to get a more complete picture.

Video
In the interest of that more complete picture, I will share one last, unexpected assessment tool: the students’ final project videos, and a panel discussion held at their presentation.

For the final project, each student chose an artwork from the Museum’s Collection and spent seven months observing it, researching it, and talking about it with others. (Since most visitors spend less than 10 seconds in front of a work of art, that is a feat in itself!) They then condensed a year’s worth of learning into a short format: 2-4 minute videos answering what the work meant to them, what it meant to others, and how their own thinking had changed through contact with the work, all questions clearly aimed at reflection.

The teens also took part in a talk-show-style discussion during the presentation of these final projects. Guests, including Museum staff, teachers, family, and friends, asked them questions about their experiences. If you are interested, you can watch the teens during the talk show in the YouTube video linked here.

Impact: can museums change teenagers?
So does everything we have discussed so far (interviews, end-of-meeting surveys, and final projects) add up to a complete picture of the impact a year of reflective practice can have on students?

I am not sure we can ever get a truly complete picture of a student’s growth in a challenging program like this. However, I think combining all of these tools can be useful, especially when the assessment tools actively support the goal of the program: our interviews, surveys, and activities were intentionally structured to stimulate reflection, tying them to the outcome itself.

This alignment was critical, not only for authentically assessing the success of the program, but also for supporting students’ skills through the methods themselves. It was also important that we educators made the goal of the program clear to the students. They knew from the start that they would be working on reflective practice, which itself helped them think reflectively from the beginning.

Regarding the impact on reflective skills, I also want to share some of the teens’ own comments about their time in the program:

"The videos helped us think more deeply about what we do, so even at school I think more deeply about what I’m doing, or why something was done, or why it happened."

"I learned not to judge a book by its cover. When I first saw my artwork, I immediately thought it was just a bunch of different colors, and I didn’t really think about what it meant. But now I know it actually had a fantastic meaning, and I would never have understood it if I hadn’t had the chance to find out. So I shouldn’t be too quick to judge."

"We had to lead the way, and I found that I really enjoy working with children and art at the same time. I’d like to go into a profession involving art education for elementary school students."

"I’ve been able to change and develop my way of thinking; now I’m able to go beyond the obvious... I learned that art holds the answers to the questions we ask; we just have to look for them."

From the other assessment tools, we saw that students developed the ability to reflect on themselves and on their own performance. But as the comments above show, they also developed the ability to reflect on the outside world: the world of art history, their future professions, how they interact with other people. These are all ways of thinking that will serve them well as they go to college, discover their passions, and pursue meaningful job opportunities.

Can teenagers change museums?
I have shown how this program helped students grow in many ways. But what about the museum itself? Have these students had an impact on our institutional practice?

Institutions are slower to change than most programs, and if change and impact are difficult to assess in sixteen students, it is ten times as complicated for an organization that welcomes hundreds of thousands of visitors a year. Nevertheless, over the past few years, the work of the students in our programs has slowly but surely led to changes within the Museum. The teens have interviewed artists on behalf of the institution. They have advised faculty on how to mentor high school students. Their video projects will become part of the Museum’s on-site and online Collection and Archives resources, available to all visitors who want to explore the artworks.

In conclusion, evaluation and impact remain a gray area with much in common with teaching itself. Done well and intentionally, evaluation does not merely show whether we achieved a goal: ideally, the tools we use to assess become part of our teaching, because they reinforce the very skills we are trying to help students develop.

What am I still reflecting on? This year, our assessment methods mostly required the teens to have specific skills, such as answering questions at the end of each meeting or knowing how to use an iPad (although we did run video-making workshops). I am thinking about other ways to collect data more holistically. For example, since many of our assessment methods grew out of teaching tools, should I film our discussions about artworks and find ways to analyze them? I would love to hear about ideas or tools you use to evaluate your own programs, just as I hope this article has inspired you to take a fresh look at your teaching and find unexpected ways to see your participants’ growth.


Warning: the English translation of the original Italian article was produced using automatic tools. We review all articles, but we cannot guarantee the total absence of inaccuracies introduced by the software. You can find the original by clicking on the ITA button. If you find any mistakes, please contact us.