Simulating Religious Violence
- Description
- Reviews
- Citation
- Cataloging
- Transcript
Program it. Predict it. Prevent it. An expert crew of computer scientists and religion scholars embarks on a three-year project to apply computer simulation and modeling to find solutions to worldwide humanitarian crises. Called to action by the Boston Marathon bombing and a rise in religious extremist terrorist attacks in North America and Europe, the scientists develop cutting-edge technology at research centers in Boston and Virginia as well as at a Norwegian university. The team eventually travels to refugee camps in Lesvos, Greece, to understand and simulate connections between religious extremism and the refugee crisis. They use this powerful modeling and simulation methodology to develop policy recommendations for predicting and preventing religious radicalization and violence.
Video Librarian | Christopher Spicer
"For academic librarians, the film is well-suited to religious studies, political science, computer science, ethics, and sociology courses. It can also enhance media literacy discussions by showing how simulation can serve humanitarian, not just analytical, goals."
Credits and citation support are not available for this title yet.
A MARC record for this title is not available yet.
Distributor subjects
Technology & Society; Media Stereotypes of Muslims; Islamophobia; Simulation Theory & AI; Intercultural, Interreligious, and Interfaith Dialogue; Religious Extremism; Boston Marathon Bombing
00:00:29.843 --> 00:00:34.114
I actually was at home because
it was Marathon Monday.
00:00:34.200 --> 00:00:39.071
The Temple Israel, my synagogue,
was closed on that day.
00:00:39.880 --> 00:00:42.160
There's an opportunity to think about
00:00:42.240 --> 00:00:45.040
going out to watch some of the
runners come in or whatever.
00:00:45.129 --> 00:00:48.500
I watched some of the marathon
00:00:48.800 --> 00:00:53.880
on television and was half
tuned into the marathon.
00:00:53.963 --> 00:00:55.600
The first runners had come in.
00:00:55.900 --> 00:00:58.514
The 117th Boston Marathon,
00:00:58.600 --> 00:01:00.600
one of the world's
great sporting events.
00:01:03.486 --> 00:01:04.471
For the first time,
00:01:04.560 --> 00:01:08.520
jurors saw images yesterday that led
the FBI to the Tsarnaev brothers.
00:01:08.600 --> 00:01:10.440
Chilling new video shows Tamerlan
00:01:10.460 --> 00:01:13.760
and Dzhokhar Tsarnaev weaving
through the crowd on Boylston Street.
00:01:13.843 --> 00:01:15.086
The second blast.
00:01:17.086 --> 00:01:21.214
If there's any lingering doubt,
I did do it along with my brother.
00:02:02.880 --> 00:02:08.440
The day of the marathon, of the bombing,
I was in the church, I was up in the tower
00:02:08.520 --> 00:02:13.480
overlooking the marathon finish line,
about 246 feet above ground,
00:02:13.563 --> 00:02:22.329
just looking down, and I think at the same
moment heard and saw an explosion.
00:02:22.886 --> 00:02:26.200
And we are looking down and we
are on our phones and we're
00:02:27.243 --> 00:02:31.457
beginning to get a picture of the horror
that has transpired on the streets.
00:02:35.720 --> 00:02:38.440
I mean, this is a cherished
event in our history.
00:02:38.529 --> 00:02:41.643
This is arguably the most special
day of the year in Boston.
00:02:42.080 --> 00:02:43.680
This happened in our city.
00:02:43.760 --> 00:02:44.760
These are our people.
00:02:44.840 --> 00:02:48.040
This is our event that we've
been covering for decades, right?
00:02:48.129 --> 00:02:52.471
This is, the Boston Marathon has
been happening on our streets
00:02:52.557 --> 00:02:54.671
since before World War I.
00:02:54.760 --> 00:02:55.920
This was a huge story,
00:02:56.003 --> 00:02:59.800
not just for Boston, but for the country
and for certain parts of the world anyway.
00:03:00.243 --> 00:03:01.237
Who did this?
00:03:01.320 --> 00:03:03.214
Who's behind this, right?
00:03:03.300 --> 00:03:05.071
How could this have happened?
00:03:05.160 --> 00:03:07.640
How could this have been?
Right?
00:03:07.729 --> 00:03:09.871
And who would target the marathon?
00:03:11.086 --> 00:03:12.329
Good morning, everybody.
00:03:13.040 --> 00:03:17.400
I've just been briefed by my national
security team on the attacks in Boston.
00:03:17.486 --> 00:03:19.986
This was a heinous and cowardly act.
00:03:20.560 --> 00:03:23.320
And given what we now know about what took
00:03:23.400 --> 00:03:26.640
place, the FBI is investigating
it as an act of terrorism.
00:03:26.720 --> 00:03:31.160
Affirming his Muslim faith, he added,
I prayed for Allah to bestow his mercy
00:03:31.240 --> 00:03:35.400
upon the deceased, those affected
in the bombing and their families.
00:03:35.480 --> 00:03:38.040
I pray for your relief, for your healing.
00:03:38.080 --> 00:03:39.200
How religion played a role
00:03:39.280 --> 00:03:42.640
in the Boston Marathon bombing is
a large and complicated question.
00:03:42.800 --> 00:03:46.757
Tamerlan had been an unusual character.
00:03:46.840 --> 00:03:49.360
He had had some struggle psychiatrically.
00:03:49.400 --> 00:03:52.800
I think he had told his mother at one
point, when he was quite young,
00:03:52.880 --> 00:03:58.080
as a teenager, that he had heard
voices in his head and he latched onto
00:03:58.163 --> 00:04:02.671
Islam as one way to keep
himself from drowning, I think.
00:04:03.186 --> 00:04:04.637
Clearly Dzhokhar and Tamerlan were
00:04:04.720 --> 00:04:07.114
learning about Islam
on the computer screen.
00:04:07.880 --> 00:04:11.840
The things they were reading, Inspire
magazine alone, which was at the time
00:04:11.920 --> 00:04:16.080
the place to go to learn about this.
You know, it's a click of a button,
00:04:16.120 --> 00:04:19.200
and there you can read all the theology
and justification of violence.
00:04:19.240 --> 00:04:22.520
And by the way, here's how you make
a bomb in the kitchen of your mom.
00:04:22.603 --> 00:04:24.157
And that's exactly what they did.
00:04:24.600 --> 00:04:25.929
It wasn't very hard.
00:04:28.280 --> 00:04:31.560
I think their disenfranchisement
is a big part of the story.
00:04:31.640 --> 00:04:34.240
That has absolutely nothing
to do with religion.
00:04:34.320 --> 00:04:38.200
It has to do with them feeling like
outcasts, them not being respected,
00:04:38.280 --> 00:04:41.760
them coming from broken families
and not having good examples.
00:04:41.840 --> 00:04:44.480
Personally, I do not believe that it's
00:04:44.560 --> 00:04:47.320
religion that drives
people to commit violence.
00:04:47.400 --> 00:04:53.040
It's really their social and economic
situation that drives them.
00:04:53.120 --> 00:05:01.240
It still is very painful for our community,
not being able to forget what happened and not
00:05:01.329 --> 00:05:04.371
being able to explain why it
happened the way it happened.
00:05:04.957 --> 00:05:08.943
There isn't an answer that could satisfy
00:05:10.760 --> 00:05:15.400
or that could answer, or even justify,
come even close to justifying that.
00:05:15.486 --> 00:05:17.614
I don't think that.
I know there isn't any.
00:05:18.560 --> 00:05:21.440
Welcome to the Cathedral of the Holy Cross
00:05:21.520 --> 00:05:26.840
in Boston South End for a special
presentation and interfaith service
00:05:26.929 --> 00:05:28.271
healing our city.
00:05:28.920 --> 00:05:30.504
Boston prays together.
00:05:30.704 --> 00:05:31.800
After the bombing,
00:05:31.840 --> 00:05:35.080
there was this sort of tremendous
effort to really bring people together.
00:05:35.163 --> 00:05:38.029
And I think the religious leaders
were really at the center of that.
00:05:46.314 --> 00:05:49.171
And we stand together, and
nobody's going to divide us.
00:05:49.514 --> 00:05:55.743
And so we come together to pray
and mourn and measure our loss.
00:05:57.800 --> 00:06:05.160
But we also come together today
to reaffirm that the spirit of this city
00:06:05.243 --> 00:06:11.271
is undaunted and the spirit of this
country shall remain undimmed.
00:06:12.080 --> 00:06:14.120
What does this have to do with religion?
00:06:14.200 --> 00:06:16.080
I think what it has to do with religion is
00:06:16.160 --> 00:06:19.960
the response, is the aftermath,
is the healing, is the pulling people
00:06:20.043 --> 00:06:26.029
together, is the overcoming hate,
refusing to slide into Islamophobia.
00:06:26.480 --> 00:06:28.000
I think that's what,
00:06:28.086 --> 00:06:31.186
I think that's the religion story
in the Boston Marathon bombing.
00:06:33.840 --> 00:06:35.920
Why does religious violence happen?
00:06:36.000 --> 00:06:37.920
There's no simple answer.
00:06:38.000 --> 00:06:40.480
After every attack, many questions arise.
00:06:40.563 --> 00:06:43.029
Lots of people try
to explain why it happened.
00:06:43.686 --> 00:06:45.971
We think that if we can
understand religious violence
00:06:46.057 --> 00:06:47.914
we might be able to prevent it.
00:06:48.480 --> 00:06:51.280
In universities and think tanks worldwide,
00:06:51.363 --> 00:06:55.486
researchers and academics are studying
and trying to explain religious terrorism.
00:06:55.840 --> 00:07:01.920
ISIS is a little bit like
the United States was in the 19th century.
00:07:02.000 --> 00:07:04.400
You know the motto
on the Statue of Liberty?
00:07:04.480 --> 00:07:07.360
Give me your poor, your needy,
your huddled masses?
00:07:07.440 --> 00:07:09.280
Well, ISIS says, come to us.
00:07:09.360 --> 00:07:11.920
Anybody, come to us. You have problems?
00:07:12.000 --> 00:07:16.720
It appeals to a fairly wide range of young
people frustrated by the fact that their
00:07:16.800 --> 00:07:20.240
aspirations, they believe,
can't ever be met.
00:07:20.329 --> 00:07:24.800
And ISIS has been pretty successful
recruiting in 110 countries.
00:07:28.286 --> 00:07:32.043
So then I decided to go to the field
and hang around with jihadis
00:07:33.886 --> 00:07:35.571
to figure out what really moves them.
00:07:37.714 --> 00:07:41.329
What I found was they believe
in what they're doing.
00:07:42.360 --> 00:07:44.280
They find a sense of purpose
and community,
00:07:44.360 --> 00:07:48.360
which are the most important things,
and none of them are really crazy.
00:07:48.440 --> 00:07:53.840
So people join and are willing to die
for their friends who also have ideas,
00:07:53.920 --> 00:07:59.120
so they can have sort of eternal glory
in the eyes of people who they care about.
00:07:59.200 --> 00:08:03.880
So the best predictor of who joins these
groups is if their friends join them.
00:08:03.960 --> 00:08:08.160
And that's why we introduced
identity fusion into the equation.
00:08:08.243 --> 00:08:13.457
And identity fusion is
a sense of visceral oneness.
00:08:13.840 --> 00:08:18.400
So when we did our work with ISIS,
we found that these two factors were
00:08:18.480 --> 00:08:23.000
extremely important, and that is the ideas
and the group dynamics,
00:08:23.080 --> 00:08:26.800
the fusion of the groups, and that made
them the most effective fighters by far.
00:08:26.886 --> 00:08:31.514
The only thing that's important to me now
is how ideas can be used to change policy.
00:08:32.280 --> 00:08:36.800
At my stage in life, writing academic
papers, it doesn't do it for me.
00:08:36.886 --> 00:08:38.771
I have no place to go in a career.
00:08:39.400 --> 00:08:42.880
Basically, I'm in a good
place in terms of that.
00:08:42.960 --> 00:08:48.600
The only thing for me to do
intellectually is to try to change policy
00:08:48.680 --> 00:08:52.640
based on what we know about the world
in terms of the evidence we have at hand.
00:08:52.720 --> 00:08:58.080
But I still think that providing
a clear notion of what's going
00:08:58.163 --> 00:09:03.086
on and the evidence at hand
in the long run, can be very helpful.
00:09:18.160 --> 00:09:19.680
My name is Leron.
00:09:19.763 --> 00:09:24.143
I'm a philosopher of religion
and a professor at a university in Norway.
00:09:25.840 --> 00:09:28.160
As a philosopher, I spend most of my time
00:09:28.240 --> 00:09:31.640
thinking about theories, but I
often get frustrated with the gap
00:09:31.729 --> 00:09:34.814
between theory and application
to these social problems.
00:09:35.200 --> 00:09:36.843
I want to figure out how to take
00:09:36.929 --> 00:09:40.071
scholarly work out of the ivory
tower into the real world.
00:09:41.600 --> 00:09:44.443
I want to know what it would
take to make a difference.
00:09:46.440 --> 00:09:48.200
This is my friend Wesley.
00:09:48.286 --> 00:09:51.343
He's a philosopher of religion
who's also a computer programmer.
00:09:51.920 --> 00:09:54.640
Wesley and I designed a project to build
00:09:54.729 --> 00:09:58.000
computer simulations of religion
in order to understand it better.
00:09:59.160 --> 00:10:00.640
Wesley thinks computer modeling
00:10:00.720 --> 00:10:04.600
and simulation is the ideal tool
for studying the complexity of religion
00:10:04.686 --> 00:10:08.514
and getting a bird's eye view of society
to see how things really work.
00:10:09.400 --> 00:10:12.320
The modeling religion project is
headquartered in Boston,
00:10:12.403 --> 00:10:15.700
just a mile away from the
finish line of the Boston Marathon.
00:10:18.280 --> 00:10:19.600
This is Saikou.
00:10:19.680 --> 00:10:24.000
Saikou is originally from Guinea,
and now he's a computer simulation expert
00:10:24.086 --> 00:10:26.843
from the Virginia Modeling,
Analysis, and Simulation Center.
00:10:27.643 --> 00:10:32.214
He spent years computer modeling war
games, simulation of traffic patterns,
00:10:32.300 --> 00:10:35.457
emergency evacuation schemes,
and flight simulators.
00:10:39.880 --> 00:10:41.080
This is Ross.
00:10:41.160 --> 00:10:43.480
He's a hardcore computer scientist.
00:10:43.563 --> 00:10:46.157
He specializes in handling big data or
00:10:46.243 --> 00:10:49.743
massive amounts of information
that can be mined for useful knowledge.
00:10:51.880 --> 00:10:53.840
Here's Justin.
00:10:53.920 --> 00:10:57.440
He's a religious studies scholar
and a computer modeling expert who is
00:10:57.529 --> 00:11:00.086
programming a way to predict
religious violence.
00:11:01.286 --> 00:11:03.000
Together, we religion scholars
00:11:03.086 --> 00:11:07.214
and computer scientists plan to apply
cutting edge technology and world class
00:11:07.300 --> 00:11:11.814
expert knowledge to real world problems
in order to try to make meaningful
00:11:11.900 --> 00:11:16.071
suggestions to policymakers about how
to improve conditions and cooperation
00:11:16.157 --> 00:11:19.257
across human societies over
the next hundred years.
00:11:20.640 --> 00:11:24.160
Modeling and simulation is used
all the time for everything.
00:11:24.243 --> 00:11:31.900
It's used to create new cars,
it's used to handle flight simulators,
00:11:31.986 --> 00:11:37.400
it's used to build distribution routes,
it's used to manage the economy,
00:11:37.486 --> 00:11:41.286
it's used for everything
that affects you in your daily life.
00:11:41.714 --> 00:11:43.686
One example of a model of human social
00:11:43.771 --> 00:11:47.571
behavior that led to change in the
real world was a traffic model built
00:11:47.657 --> 00:11:50.571
at the Virginia Modeling,
Analysis, and Simulation Center.
00:11:51.114 --> 00:11:55.000
Computer scientists there were able
to show when traffic would be particularly
00:11:55.086 --> 00:11:58.557
bad, and to make recommendations
for a different pattern of traffic light
00:11:58.643 --> 00:12:01.829
rotation and the construction
of frontage roads.
00:12:02.714 --> 00:12:06.414
These plans were sent to the city
council and the changes were made.
00:12:06.757 --> 00:12:08.671
So there's a history of modeling human
00:12:08.757 --> 00:12:12.197
social behavior to find
optimal ways of being together.
00:12:12.280 --> 00:12:14.160
And we wondered if we could apply the same
00:12:14.243 --> 00:12:17.600
principles to modeling religious
radicalization and terrorism.
00:12:18.486 --> 00:12:22.157
It's a very active, energetic
process with thousands of ideas
00:12:22.243 --> 00:12:27.643
flying around and people figuring out
over time which ideas are most important.
00:12:27.960 --> 00:12:29.680
So they figure out exactly what needs
00:12:29.763 --> 00:12:33.786
to be implemented and what doesn't need to
be implemented in the model architecture.
00:12:34.200 --> 00:12:36.800
Who will be America's next top modeler?
00:12:37.329 --> 00:12:41.571
We have before us nine contestants,
but only eight spots.
00:12:45.514 --> 00:12:47.671
We build our models using
social-scientific
00:12:47.757 --> 00:12:50.486
research data
that describes the real world.
00:12:50.960 --> 00:12:52.600
There's a lot of information out there
00:12:52.680 --> 00:12:56.800
about how the world works that can be
collected from a vast amount of scientific
00:12:56.886 --> 00:13:00.557
publications, also from census
data or public databases.
00:13:01.600 --> 00:13:03.320
Our teams have harnessed that data,
00:13:03.403 --> 00:13:07.314
formalized it into computer code,
and then used it to program our models.
00:13:07.757 --> 00:13:11.786
The agents, the simulated people that populate the model,
00:13:11.986 --> 00:13:15.543
reflect the data about human cognition and social psychology.
00:13:16.971 --> 00:13:19.514
So the models are calibrated using data about people
00:13:19.600 --> 00:13:21.671
in the world we live in.
00:13:22.086 --> 00:13:26.043
Once we validate that
the model mirrors a given society in some
00:13:26.129 --> 00:13:29.614
key ways, we can start to play
with different variables and parameters
00:13:29.800 --> 00:13:33.729
to better understand the complex
relationships between social factors
00:13:34.071 --> 00:13:40.557
such as religious identity, diversity,
geography, education, integration, health,
00:13:40.643 --> 00:13:43.657
gender, and vulnerability
to radicalization.
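The workflow just described, calibrating agents to data, validating, then varying one social factor at a time, can be sketched in a toy form. Everything below (the agent attributes, damping values, and the 0.8 threshold) is invented for illustration and is not the project's actual model:

```python
import random

def make_agent(rng, education_rate):
    """Create one simulated person with illustrative attributes."""
    return {
        "educated": rng.random() < education_rate,
        "integrated": rng.random() < 0.5,
        "vulnerability": rng.random(),  # calibrated from data in a real model
    }

def step(agent, rng):
    """One illustrative update: education and integration damp vulnerability."""
    damping = 0.02 * agent["educated"] + 0.02 * agent["integrated"]
    agent["vulnerability"] = max(
        0.0, agent["vulnerability"] - damping + rng.uniform(-0.01, 0.01)
    )

def run(education_rate, n_agents=1000, n_steps=50, seed=0):
    """Run one simulation; return the share of highly vulnerable agents."""
    rng = random.Random(seed)
    agents = [make_agent(rng, education_rate) for _ in range(n_agents)]
    for _ in range(n_steps):
        for a in agents:
            step(a, rng)
    return sum(a["vulnerability"] > 0.8 for a in agents) / n_agents

# Parameter sweep: vary one social factor and observe the aggregate outcome.
for rate in (0.2, 0.5, 0.8):
    print(f"education_rate={rate:.1f} -> high-vulnerability share={run(rate):.3f}")
```

A real model would replace the invented numbers with estimates drawn from the scientific publications, census data, and public databases mentioned above; the structure of the sweep is the point.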
00:13:59.720 --> 00:14:02.440
You feel like MODRN is
gonna take off this morning?
00:14:02.520 --> 00:14:05.080
I do.
Well, it's gonna at least mini takeoff.
00:14:05.163 --> 00:14:08.071
'Cause the big maxi takeoff is in June.
00:14:09.480 --> 00:14:12.200
This one's supported
by the university directly.
00:14:12.280 --> 00:14:15.760
So it's a chance to sort
of get an early start.
00:14:15.843 --> 00:14:20.243
Introduce colleagues here at the
university to folk, to people on the team.
00:14:20.443 --> 00:14:22.129
Why religion?
00:14:22.557 --> 00:14:27.114
For most people on the planet, religion is a
deeply important part of their life.
00:14:27.586 --> 00:14:32.143
And yet, it is often left out of the scientific study,
out of the academy.
00:14:32.329 --> 00:14:35.057
It doesn't have--it's not the center of
the university, as it should be.
00:14:35.829 --> 00:14:40.429
Religion can cultivate virtue, it can help give people
00:14:40.914 --> 00:14:46.729
the motivation for their norms, for the
way in which they behave morally in the world.
00:14:46.929 --> 00:14:50.243
But it can also, under certain conditions, lead
to conflict between groups.
00:14:50.443 --> 00:14:53.743
So we believe it is a very important
phenomenon to study.
00:14:56.571 --> 00:14:58.486
One of our goals is to develop computer
00:14:58.571 --> 00:15:02.371
simulations that can help us
compare competing policy proposals.
00:15:02.671 --> 00:15:04.657
Take for example, debates about the best
00:15:04.743 --> 00:15:08.257
way to integrate refugees
into the society of a host country.
00:15:08.640 --> 00:15:12.560
A government, an NGO,
the Red Cross, United Nations.
00:15:12.643 --> 00:15:14.257
Any kind of organization will be
00:15:14.343 --> 00:15:18.214
interested in implementing policies
to facilitate a healthy society.
00:15:18.486 --> 00:15:20.714
But they'll have limited
funds and resources.
00:15:21.360 --> 00:15:26.000
Say one has 100 million
euro to promote integration.
00:15:26.080 --> 00:15:30.480
Should we spend that money on cultural
education, teaching arriving refugees
00:15:30.563 --> 00:15:33.271
about social values
such as gender equality?
00:15:33.680 --> 00:15:35.440
Should we spend the money on opportunities
00:15:35.529 --> 00:15:38.329
to integrate or interact
with local host culture?
00:15:39.000 --> 00:15:41.040
Or on providing economic support,
00:15:41.129 --> 00:15:44.086
or language instruction,
or defense and security?
00:15:44.786 --> 00:15:46.943
What our models can help one do is see how
00:15:47.029 --> 00:15:49.900
each of these policies is likely
to play out in the future.
00:15:50.243 --> 00:15:51.514
Compare different outcomes.
00:15:52.160 --> 00:15:53.960
That way, one can make more informed
00:15:54.043 --> 00:15:57.886
decisions about the best policies
to adopt in order to meet one's goals.
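The policy comparison just described can be sketched in miniature. The policy names and per-step effect sizes below are invented placeholders standing in for calibrated estimates, not the project's code:

```python
import random

# Hypothetical policy -> per-step boost to an agent's integration score.
POLICIES = {
    "cultural_education": 0.020,
    "local_interaction": 0.025,
    "economic_support": 0.015,
}

def simulate(policy, n_agents=1000, n_steps=20, seed=1):
    """Return the mean integration score after applying one policy."""
    rng = random.Random(seed)
    # Agents start poorly integrated (scores in [0, 0.3]).
    scores = [rng.random() * 0.3 for _ in range(n_agents)]
    boost = POLICIES[policy]
    for _ in range(n_steps):
        scores = [min(1.0, s + boost + rng.uniform(-0.005, 0.005)) for s in scores]
    return sum(scores) / len(scores)

# Compare projected outcomes under the same budget assumption.
for name in POLICIES:
    print(f"{name}: mean integration = {simulate(name):.3f}")
```

Running each candidate policy through the same virtual society, with the same random seed, is what makes the outcomes comparable before any money is spent.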
00:15:59.280 --> 00:16:00.600
You have an interest in modeling
00:16:00.680 --> 00:16:07.600
and simulation because you want
experts to get those simulations right.
00:16:07.680 --> 00:16:09.720
But the deeper meaning of modeling is
00:16:09.803 --> 00:16:14.843
to help public policy people, and that's
the part that's really challenging.
00:16:14.929 --> 00:16:20.271
We could fail to actually develop models
that can be calibrated to the data.
00:16:20.360 --> 00:16:22.120
The failure would consist especially
00:16:22.203 --> 00:16:29.171
in not delivering
a public policy projection machine
00:16:29.643 --> 00:16:32.871
that would actually be useful
to people in public policy circles.
00:16:39.000 --> 00:16:42.843
I actually was living in Oxford
at the time and I was having a dinner
00:16:42.929 --> 00:16:46.571
party that night and had a bunch
of my friends over for the evening.
00:16:46.657 --> 00:16:48.114
I kept hearing my phone go off.
00:16:48.200 --> 00:16:50.557
It was ding, ding, ding, ding, ding.
It was so annoying.
00:16:50.643 --> 00:16:54.057
And I didn't really want to bother myself
as to why my phone was going crazy.
00:16:54.320 --> 00:16:59.680
I went and checked my phone to see
what all the sounds were about.
00:16:59.760 --> 00:17:02.800
There had been a bombing in Boston during
00:17:02.886 --> 00:17:05.929
the marathon and it gave me
sort of a sense of helplessness.
00:17:07.400 --> 00:17:11.280
I felt the day after the Boston bombing
that I needed to be doing more.
00:17:11.360 --> 00:17:13.720
I should be doing something about this.
00:17:13.800 --> 00:17:15.000
This is what I do.
00:17:15.086 --> 00:17:18.771
My contribution to the academic literature
should not just be for the sake of
00:17:18.857 --> 00:17:19.843
pure knowledge.
00:17:20.600 --> 00:17:25.514
We can understand
why people will bomb
00:17:25.600 --> 00:17:30.840
a marathon, or bring down the two towers,
or shoot up a nightclub
00:17:30.920 --> 00:17:36.080
in Paris, or drive a truck
into a Christmas market in Germany.
00:17:36.163 --> 00:17:39.671
And as someone who does modeling
of religion, I can help to answer these
00:17:39.757 --> 00:17:43.600
questions so that we can better prevent
these things from happening in the future.
00:17:52.586 --> 00:17:54.571
At Oxford, I developed questions
00:17:54.657 --> 00:17:58.597
for online survey research about religious
violence, specifically the troubles
00:17:58.686 --> 00:18:01.543
between Catholics and Protestants
in Northern Ireland.
00:18:01.629 --> 00:18:04.300
It happens that these set of questions
actually were very relevant
00:18:04.386 --> 00:18:05.237
to the Boston bombing.
00:18:05.320 --> 00:18:06.560
And we did have a hypothesis.
00:18:06.560 --> 00:18:07.760
We had a hypothesis that the people
00:18:07.840 --> 00:18:11.680
who were going to be more emotionally
affected by the Boston bombing,
00:18:11.763 --> 00:18:16.171
were going to be more fused with
their identity of Boston because
00:18:16.257 --> 00:18:18.943
it's such a visceral feeling
of oneness with your group.
00:18:19.043 --> 00:18:21.271
You internalize your
group as part of yourself.
00:18:21.357 --> 00:18:25.157
So to distance yourself from your group is
to distance yourself from yourself.
00:18:25.360 --> 00:18:30.640
The people who are more fused are more
likely to endorse extreme behaviors.
00:18:30.720 --> 00:18:33.000
They're more likely to take up arms.
00:18:33.086 --> 00:18:37.043
They're more likely to sacrifice, even
donating blood after the Boston bombing.
00:18:38.320 --> 00:18:40.360
This effect of fusion of personal
00:18:40.443 --> 00:18:45.871
and social identities
proposes a very specific cognitive
00:18:46.400 --> 00:18:52.920
architecture that I program into an agent-based
model in order to try and understand
00:18:53.003 --> 00:18:56.886
how intergroup relations can
result in negative effects.
00:18:57.920 --> 00:19:02.680
So this is the model of mutually
escalating religious violence.
00:19:02.763 --> 00:19:07.043
The way that we started this was
by building up all of the different
00:19:07.129 --> 00:19:14.643
components that go into a human's mind
almost in a lego-like fashion.
00:19:14.800 --> 00:19:17.240
We took these different tendencies that we
00:19:17.320 --> 00:19:21.720
have, like the tendencies to be afraid or
the tendencies to believe,
00:19:21.803 --> 00:19:26.457
the tendencies to do things as a group,
the tendencies to have ritual,
00:19:27.000 --> 00:19:30.240
the tendencies to transmit
emotion and information.
00:19:30.320 --> 00:19:32.120
And so there's just thousands
00:19:32.203 --> 00:19:36.471
and thousands of lines of code
that go in the background here.
00:19:37.240 --> 00:19:38.760
We built these
00:19:38.843 --> 00:19:43.471
different algorithms for each one of these
tendencies and tried to use that to build
00:19:43.557 --> 00:19:49.197
our brain, and build a brain in such a way
that we could not just create one of them,
00:19:49.286 --> 00:19:53.729
like a single artificial intelligence,
but copy and paste it hundreds or
00:19:53.814 --> 00:19:55.957
thousands or millions
of times if we needed to.
00:19:56.040 --> 00:19:58.520
To create multi-agent
artificial intelligence.
00:19:58.600 --> 00:20:00.600
We can simulate tens of thousands
00:20:00.680 --> 00:20:04.320
of incidences of violence,
tens of thousands of different conflicts,
00:20:04.403 --> 00:20:08.243
and study what it is about our
minds that is creating conflict.
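The lego-like composition described here, separate tendency algorithms assembled into one simulated mind and then copied many times over, can be illustrated with a minimal sketch. The class, the tendency functions, and all constants are invented placeholders, not the project's code:

```python
import copy
import random

class Agent:
    """A toy agent mind assembled from separate tendency components."""

    def __init__(self):
        self.fear = 0.1
        self.group_identity = random.random()

    def tend_to_fear(self, threat):
        """Fear tendency: fear rises with perceived threat."""
        self.fear = min(1.0, self.fear + 0.5 * threat)

    def tend_to_transmit(self, neighbor):
        """Emotion-transmission tendency: fear diffuses between neighbors."""
        shared = 0.5 * (self.fear + neighbor.fear)
        self.fear = neighbor.fear = shared

# Build one "brain", then copy and paste it thousands of times.
prototype = Agent()
population = [copy.deepcopy(prototype) for _ in range(10_000)]

# Simulate one threatening event rippling through the population.
population[0].tend_to_fear(threat=1.0)
for i in range(len(population) - 1):
    population[i].tend_to_transmit(population[i + 1])

print(f"mean fear after event: {sum(a.fear for a in population) / len(population):.3f}")
```

Each tendency lives in its own method, so new components (ritual, belief, group action) could be added without touching the others, and the whole mind replicates cheaply into a multi-agent run.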
00:20:23.957 --> 00:20:28.514
People look for religion as a cause,
and religion is really never the cause.
00:20:28.600 --> 00:20:30.240
They're looking for the easy explanation.
00:20:30.329 --> 00:20:35.700
And what we offer is a more complex,
but still computationally controllable
00:20:36.403 --> 00:20:39.229
perspective on what's going
on when it comes to religion.
00:20:40.440 --> 00:20:43.600
This model has potentially predictive
00:20:43.686 --> 00:20:47.757
capacities to understand the future
of conflicts, because what we're looking
00:20:47.843 --> 00:20:53.397
at are the causal mechanisms behind
religious and ethnic conflict.
00:20:53.486 --> 00:20:57.243
If you can isolate causal mechanisms
and the appropriate conditions under
00:20:57.329 --> 00:21:02.314
which religious and ethnic violence will
occur, you can, in principle, predict it.
00:21:05.400 --> 00:21:07.880
This approach, though, is very new.
00:21:07.963 --> 00:21:10.371
I mean, we are the group
that's pioneering this.
00:21:10.680 --> 00:21:17.120
So this has repercussions for policy,
has repercussions for a military strategy.
00:21:17.200 --> 00:21:18.920
Maybe we need to rethink the fact that we
00:21:19.000 --> 00:21:22.720
just, you know, we're raining
bombs down from the sky on people.
00:21:22.800 --> 00:21:26.680
The people who are left over,
the brothers and the sisters of the family
00:21:26.760 --> 00:21:30.320
members who are killed through things like
drone strikes and other military actions
00:21:30.403 --> 00:21:33.271
are highly likely to become
terrorists themselves.
00:21:37.920 --> 00:21:39.760
When I look at these conditions,
00:21:39.843 --> 00:21:43.555
how frequently do they produce
periods of mutual escalation?
00:21:44.200 --> 00:21:46.320
Right. So it's almost
like a correlation.
00:21:46.403 --> 00:21:49.971
I was worried that somebody might say,
okay, well, you know, in the next week,
00:21:50.057 --> 00:21:53.186
how much mutually escalating religious
violence do you predict in the world?
00:21:53.800 --> 00:21:54.720
I have no idea.
00:21:54.760 --> 00:21:57.200
You know, they think of a weather
prediction, and they should.
00:21:57.280 --> 00:21:59.520
That's a real way to think of prediction.
00:21:59.600 --> 00:22:01.200
But the model doesn't do that.
00:22:01.286 --> 00:22:05.386
It just highlights conditions that create
mutually escalating religious violence.
00:22:06.371 --> 00:22:12.357
When I talk about predicting religious
extremism and using computer simulations,
00:22:12.443 --> 00:22:17.143
a lot of people think of the movie
Minority Report, as if I'm saying that this
00:22:17.229 --> 00:22:20.237
individual at this location,
at this date and time,
00:22:20.320 --> 00:22:24.400
is going to commit an act of violence
and therefore, let's go arrest them.
00:22:24.486 --> 00:22:26.200
I don't think that that's possible.
00:22:26.640 --> 00:22:29.640
I think that when we're predicting
religious extremism,
00:22:29.729 --> 00:22:33.529
what we're doing is saying that given
a certain context and a certain set
00:22:33.614 --> 00:22:38.957
of people, the actions
that we see in this area are
00:22:39.040 --> 00:22:43.080
likely to give rise to higher levels
of radicalization and terrorism.
00:22:43.160 --> 00:22:44.960
You have to actually have eyes
00:22:45.040 --> 00:22:49.920
on the ground in order to see
and prevent the actual act of terrorism.
00:22:50.000 --> 00:22:51.920
The more realistic the virtual society
00:22:52.003 --> 00:22:56.414
that you build, the more realistic
the implementation of the policy is.
00:22:56.760 --> 00:22:58.760
Now, why would you do that?
00:22:58.800 --> 00:23:01.480
Why wouldn't you just try
a policy out in the real world?
00:23:01.560 --> 00:23:03.920
Because it's expensive and time consuming,
00:23:04.000 --> 00:23:08.200
and you have to use a lot of credibility
to even get a policy back to the point
00:23:08.286 --> 00:23:11.186
that it can be implemented
in legislation and in law.
00:23:11.720 --> 00:23:13.200
It's very expensive.
00:23:13.280 --> 00:23:14.560
It's very risky.
00:23:14.643 --> 00:23:19.814
It's better to try that out in a virtual
society first, whenever possible.
00:23:20.200 --> 00:23:21.760
So it's not that we've created
00:23:21.840 --> 00:23:25.400
a prediction machine,
but what we have created is a tool
00:23:25.480 --> 00:23:29.880
that can root policy decisions
in facts, data, and theory.
00:23:29.960 --> 00:23:32.040
And that in itself is valuable
00:23:32.129 --> 00:23:36.929
because it can offer a near term
abstract predictive capability.
00:23:38.720 --> 00:23:44.640
Trying to take my scholarly work out
of the ivory tower and into the real world
00:23:44.720 --> 00:23:49.800
has always been important to me,
but important to me only in a certain way.
00:23:49.880 --> 00:23:56.400
Up until recent years,
the way I used to think of it was that I
00:23:56.440 --> 00:23:59.760
would come up with a really good
theory by sitting in my ivory tower.
00:23:59.840 --> 00:24:04.400
I would try to write an article
which clarified how to think about moral
00:24:04.486 --> 00:24:07.986
issues, and then someone else would
take that and apply it to a moral issue.
00:24:08.800 --> 00:24:10.640
So not very practical at all, really.
00:24:10.720 --> 00:24:13.440
It was just more of a kind
of self-justification.
00:24:13.529 --> 00:24:17.057
Wouldn't it be nice if someone applied my
theory so I can stay in the ivory tower?
00:24:17.414 --> 00:24:22.157
The thing about trying to take
intellectual work and make it useful
00:24:22.240 --> 00:24:25.640
in the world is really
difficult for humanities people.
00:24:25.720 --> 00:24:27.760
You can spend your entire life studying
00:24:27.840 --> 00:24:32.040
this deposit of key literature
from the far past,
00:24:32.120 --> 00:24:36.320
and it may never have any impact
on anyone, practically speaking,
00:24:36.360 --> 00:24:39.400
in terms of the sharp social
problems that we try to deal with.
00:24:39.480 --> 00:24:41.560
You have to be smart about
how you choose your topics.
00:24:41.643 --> 00:24:45.271
But if you are smart,
you can actually find things that
00:24:46.520 --> 00:24:48.760
fundamental research can
make a difference on.
00:24:48.840 --> 00:24:51.360
And a classic example
is religious extremism.
00:24:51.440 --> 00:24:53.440
I want someone who was going to blow
00:24:53.520 --> 00:24:57.720
themselves up to decide not to blow
themselves up because they've got some
00:24:57.760 --> 00:25:00.680
other vision of the way life can be,
or because they read an article
00:25:00.763 --> 00:25:05.386
that convinced them that there are other
ways of protesting unjust situations.
00:25:07.080 --> 00:25:10.480
I have a Greek colleague
from my university, Apostolos.
00:25:10.560 --> 00:25:12.280
He came to one of our public seminars
00:25:12.320 --> 00:25:16.040
about the Modeling Religion Project
and heard us describing the way we apply
00:25:16.129 --> 00:25:19.143
simulation technology
to humanitarian problems.
00:25:19.600 --> 00:25:21.760
So Apostolos was planning a conference
00:25:21.840 --> 00:25:26.440
in Lesbos, Greece, an island that has been
hit especially hard by the refugee crisis.
00:25:26.520 --> 00:25:30.600
So Apostolos suggested that our team come
a couple of days early and actually visit
00:25:30.686 --> 00:25:35.157
the camps where the refugees stay
after crossing from Turkey into Greece.
00:25:35.829 --> 00:25:37.514
It worked out particularly well because
00:25:37.603 --> 00:25:40.943
some of our modelers had already
started working on a refugee model.
00:25:42.243 --> 00:25:46.914
So we decided to focus on refugees,
radicalization, and integration.
00:26:21.386 --> 00:26:24.057
How many people does this kind of boat hold?
00:26:24.143 --> 00:26:28.143
100 people.
100? But it's so small.
00:26:28.229 --> 00:26:31.400
No, it's not so small.
Where do they go? Do they go underneath?
00:26:31.500 --> 00:26:32.500
Inside.
00:26:32.700 --> 00:26:35.971
They are all...stuffed against each other?
00:26:36.486 --> 00:26:37.600
Yes.
00:27:10.714 --> 00:27:13.143
Summer of 2015.
00:27:13.555 --> 00:27:16.060
From about April to November,
00:27:16.960 --> 00:27:21.800
the number of refugees that were coming
across in these rubber rafts and some
00:27:21.880 --> 00:27:26.360
boats, rows and rows and rows,
and at the peak it was about 10,000 a day.
00:27:26.445 --> 00:27:30.740
Basically one rubber raft with about
50 people every twelve minutes.
00:27:31.120 --> 00:27:32.600
The distance is only about 5 miles.
00:27:32.680 --> 00:27:35.520
You could see them coming
in one after the other.
00:27:35.560 --> 00:27:37.440
This little village has only about 800 or
00:27:37.525 --> 00:27:43.890
900 people, and so to try to take care
of that many people per day is staggering.
00:27:44.680 --> 00:27:46.160
And the woman
00:27:46.240 --> 00:27:48.080
with whom we spoke yesterday,
00:27:48.165 --> 00:27:50.105
well, she used the word tsunami.
00:27:51.200 --> 00:27:56.040
It's a kind of event that, you know,
is unimaginable, and nobody,
00:27:56.125 --> 00:27:57.875
well, nobody
can prepare for it.
00:27:58.160 --> 00:28:00.755
You just do what you
have to when it happens.
00:28:50.720 --> 00:28:53.080
Many people associate the recent refugee
00:28:53.165 --> 00:28:56.855
and immigration crises with the threat
of religious radicalization.
00:28:57.400 --> 00:29:01.355
But what role, if any, does
religion play in intergroup violence?
00:29:01.960 --> 00:29:03.560
And what policies, if any,
00:29:03.640 --> 00:29:07.640
can help facilitate the peaceful
settlement of peoples of different faiths
00:29:07.725 --> 00:29:10.275
into contexts that are
often quite secular?
00:29:17.429 --> 00:29:21.600
There's a complex but clear connection between
forced migration and religious violence.
00:29:22.600 --> 00:29:25.040
Not only are refugees and asylum seekers
00:29:25.120 --> 00:29:29.360
often fleeing from violent religious
extremism, but a small percentage of their
00:29:29.445 --> 00:29:33.530
children grow up to perpetrate
religiously inspired terror attacks.
00:29:33.800 --> 00:29:35.760
Research shows that individuals affected
00:29:35.845 --> 00:29:40.770
by religious extremism and violence are
more likely to turn to radical ideologies.
00:29:41.400 --> 00:29:43.680
So we went to Greece to learn more about
00:29:43.760 --> 00:29:47.640
how refugees and their children are
impacted by religious extremism
00:29:47.725 --> 00:29:51.450
and whether this might increase their
vulnerability to being radicalized.
00:29:52.080 --> 00:29:57.315
The Syrian refugee crisis was really
predicated on a very specific
00:29:58.680 --> 00:30:01.080
set of circumstances
in the Middle East at the time.
00:30:01.165 --> 00:30:01.875
You know,
00:30:01.960 --> 00:30:06.400
they really created a situation
where ISIS was allowed to
00:30:06.480 --> 00:30:10.520
flourish in the power vacuums
that existed in the region,
00:30:10.605 --> 00:30:16.015
and it created a new group in the region
that made life unbearable for many people.
00:30:16.360 --> 00:30:20.920
They could no longer tolerate
life in that area.
00:30:21.000 --> 00:30:24.880
So understanding why it was that they
00:30:24.960 --> 00:30:28.840
wanted to move from Syria in
through Turkey to Greece, really,
00:30:28.920 --> 00:30:32.720
you have to understand
the nature of the religious landscapes
00:30:32.805 --> 00:30:35.395
and religious identities that are
interplaying in the region.
00:30:35.960 --> 00:30:40.200
What happens after the refugees
and asylum seekers land is complicated.
00:30:40.800 --> 00:30:43.080
Unfortunately, life in Europe and America
00:30:43.165 --> 00:30:46.880
isn't always easy, not for immigrants,
and even less so for their children.
00:30:54.640 --> 00:30:58.180
Second or third generation
immigrants who grew up in the EU.
00:30:58.640 --> 00:31:01.000
These are not the same
refugees we see on the news.
00:31:01.080 --> 00:31:05.040
Hundreds of people walking across
fields to try to get to safe territory.
00:31:05.125 --> 00:31:07.745
Rather, it is their
children and grandchildren.
00:31:08.440 --> 00:31:10.520
Many young Muslims of migrant origin,
00:31:10.600 --> 00:31:15.160
whether newly arrived or born in Europe,
develop a widespread feeling that they are
00:31:15.240 --> 00:31:17.760
not fully accepted as fellow citizens.
00:31:17.840 --> 00:31:20.000
They feel caught between two cultures,
00:31:20.085 --> 00:31:24.435
disenfranchised and alienated in a society
that does not fully accept them.
00:31:24.920 --> 00:31:26.800
The same goes for America.
00:31:26.880 --> 00:31:30.880
In fact, every jihadist who conducted
a lethal attack inside the United States
00:31:30.965 --> 00:31:34.410
since 9/11 was a citizen or legal resident.
00:31:34.800 --> 00:31:37.280
Many have been second
generation immigrants.
00:31:38.640 --> 00:31:42.880
The cell phone leads police to Rahami,
a naturalized US citizen
00:31:42.960 --> 00:31:48.640
born in Afghanistan. Rahami graduated from
Edison High School in New Jersey in 2007.
00:31:48.720 --> 00:31:51.360
Saipov, an immigrant from Uzbekistan,
00:31:51.440 --> 00:31:56.400
moved to the US in 2010 and held a green
card and a Florida driver's license.
00:31:56.480 --> 00:31:59.080
There's definitely a link between what
00:31:59.160 --> 00:32:02.720
we're doing with the artificial
intelligence to study religious violence
00:32:02.800 --> 00:32:06.480
and what we're doing to study
the refugee crisis in Lesbos.
00:32:06.560 --> 00:32:10.240
And the key to that connection is human
00:32:10.325 --> 00:32:13.040
identity and how we
identify as individuals.
00:32:13.360 --> 00:32:14.600
We're dealing with an artificial
00:32:14.680 --> 00:32:17.920
intelligence system that is
modeling our identity.
00:32:18.005 --> 00:32:21.880
We're applying the models of our
identity to understand the identities
00:32:21.965 --> 00:32:23.865
of refugees in Lesbos.
00:32:23.950 --> 00:32:27.990
A refugee is, by nature, someone who holds a
00:32:28.190 --> 00:32:30.985
specific identity, the identity
of their home culture
00:32:31.070 --> 00:32:33.560
of their home country, and
they're trying to get to a place
00:32:33.865 --> 00:32:39.040
where they are going to be, almost
surely a minority amongst people
00:32:39.125 --> 00:32:42.520
who hold different ideas, different
beliefs, and different identities.
00:32:42.920 --> 00:32:45.600
How are they interacting
with the local communities there?
00:32:45.680 --> 00:32:47.840
How are they assimilating or integrating
00:32:47.920 --> 00:32:51.840
into European culture or wherever
they decide to make their new homes?
00:32:51.920 --> 00:32:55.520
These are going to be questions
that are going to need to be answered.
00:32:55.600 --> 00:32:57.520
Politicians in America and Europe have
00:32:57.605 --> 00:33:01.055
capitalized on the easy association
between refugees and terrorism,
00:33:01.480 --> 00:33:04.505
successfully polarizing
populations over the issue.
00:33:04.943 --> 00:33:07.771
But the research tells a more complicated story
00:33:07.971 --> 00:33:13.371
about how the refugee crisis is the result of and
the precursor to religious extremist violence.
00:33:14.080 --> 00:33:16.240
The more we can understand the complex web
00:33:16.320 --> 00:33:20.240
of factors involved, the more effective we
can be in promoting inclusion,
00:33:20.325 --> 00:33:24.450
social cohesion, education,
and integration among diverse peoples.
00:33:26.960 --> 00:33:28.520
And so when we're understanding what's
00:33:28.600 --> 00:33:32.520
going on in Lesbos, we have to not just be
thinking about the people who are
00:33:32.600 --> 00:33:35.320
currently on those boats, the people
who are currently on the island
00:33:35.400 --> 00:33:38.480
or in the refugee camps,
but that in years to come,
00:33:38.560 --> 00:33:42.400
they're going to be settled throughout
Europe and they're going to have homes
00:33:42.440 --> 00:33:43.880
there and they're going
to be raising kids there.
00:33:43.960 --> 00:33:47.960
And how is it that we can help to better
00:33:48.045 --> 00:33:54.335
integrate them so that they can maintain
their identity as Syrians or as Muslims
00:33:54.970 --> 00:34:00.195
or as Middle Eastern Christians,
and also help their children to feel
00:34:00.280 --> 00:34:04.240
at home so that they can also have
an identity that is Norwegian,
00:34:04.320 --> 00:34:09.120
that is British, so that when they grow
up, they're not going to have an animosity
00:34:09.205 --> 00:34:14.450
towards the country and towards the people
that they want to feel at home with.
00:34:15.800 --> 00:34:21.210
The refugee crisis in Lesbos is
quite developed and it's quite acute.
00:34:21.640 --> 00:34:25.570
Some of the refugees from here
wind up going to Norway, so we can
00:34:26.120 --> 00:34:30.640
be here and learn something about refugees
in general, and at the same time build
00:34:30.725 --> 00:34:34.600
models that are specific to particular
countries, such as Norway.
00:34:35.600 --> 00:34:40.040
I look around, I'm not sure that we're
there yet, because I see some buildings,
00:34:40.125 --> 00:34:44.190
some tents, and I'm looking around,
and I'm not sure we're there.
00:34:44.329 --> 00:34:46.750
Is this the camp or not?
It has to be.
00:34:47.136 --> 00:34:50.211
It doesn't look
like what I expected.
00:34:53.357 --> 00:34:58.614
We're walking up. Apostolos,
our host, is introducing us
00:34:58.800 --> 00:35:01.314
to the woman who runs the camp.
00:35:02.960 --> 00:35:04.480
She's telling us,
00:35:04.563 --> 00:35:09.286
she's pointing to different buildings
and showing us the volunteers.
00:35:09.486 --> 00:35:12.614
I can just give you an introduction about the place.
00:35:13.643 --> 00:35:15.971
Pikpa started in 2012.
00:35:16.700 --> 00:35:18.857
It was an abandoned summer camp for children.
00:35:19.057 --> 00:35:23.757
Because of the crisis, it couldn't operate anymore.
00:35:23.957 --> 00:35:27.314
So the place was empty for some years.
00:35:27.800 --> 00:35:31.543
The main problem for us is that there is no strategy,
00:35:31.700 --> 00:35:34.829
and everything is changing from one day to the other.
00:35:34.914 --> 00:35:40.829
It is very difficult for the people that get
psychologically affected by this insecurity.
00:35:41.029 --> 00:35:43.614
We have thousands of people arriving.
00:35:43.814 --> 00:35:47.014
From May on, we have an increase
of 500 people per day,
00:35:47.371 --> 00:35:51.771
and then we go to 3,000
and even 8 to 10,000 per day.
00:35:53.200 --> 00:35:55.480
The volunteers have come from different
00:35:55.560 --> 00:36:00.000
places all over the world,
some from Greece, from the islands.
00:36:00.086 --> 00:36:02.014
Some of them are themselves refugees.
00:36:02.657 --> 00:36:06.643
In 2015, we had people
who stayed only two or three days.
00:36:07.057 --> 00:36:09.400
Now it is some months, the minimum.
00:36:09.680 --> 00:36:12.800
She's leading us into this building,
00:36:12.886 --> 00:36:21.014
and to the right are rows and rows
of children's clothes, piled boxes,
00:36:22.280 --> 00:36:29.360
shoes, hats, boxes labeled girls'
dresses, age six, and so forth.
00:36:29.443 --> 00:36:30.700
As far as you can see.
00:36:56.840 --> 00:37:00.320
What I've seen recently has affected
00:37:00.403 --> 00:37:07.143
the way I think about the situation
and how to frame the problem.
00:37:08.200 --> 00:37:09.640
I know there's something for me to do.
00:37:09.720 --> 00:37:12.600
I just don't know what it is,
and I can't figure it out.
00:37:12.686 --> 00:37:18.343
And I don't even know if I have
enough distance or enough
00:37:19.000 --> 00:37:22.520
understanding and maturity
to really process what I'm seeing.
00:37:22.600 --> 00:37:29.480
So I try to be careful not to
make huge leaps and claims about the way
00:37:29.560 --> 00:37:31.800
it changed the way I
would approach things.
00:37:31.886 --> 00:37:34.114
That's why we have the experts after all.
00:37:35.754 --> 00:37:38.960
With the original culture, with alpha.
00:37:39.000 --> 00:37:41.040
Yeah.
Host is here again.
00:37:41.120 --> 00:37:41.520
Okay.
00:37:41.600 --> 00:37:44.320
We're working primarily
on four different models.
00:37:44.400 --> 00:37:47.960
Each one deals with a particular
phase of the migration of peoples.
00:37:48.040 --> 00:37:50.520
The first model is the refugee process.
00:37:50.603 --> 00:37:54.300
We have more scholars here working
on that model with two different teams.
00:38:01.560 --> 00:38:05.800
We heard from some crisis volunteers
that the influx of refugee arrivals was so
00:38:05.886 --> 00:38:09.700
overwhelming that many locals became
stressed and conflicted among themselves.
00:38:10.114 --> 00:38:12.329
Ethnic stereotypes and even xenophobia
00:38:12.414 --> 00:38:16.197
came into play with some of the local
hotel owners and municipalities.
00:38:16.280 --> 00:38:18.720
That's one of the central
challenges of integration,
00:38:18.800 --> 00:38:21.360
defusing racial and ethnic stereotypes.
00:38:21.560 --> 00:38:23.600
When they go into a new population,
00:38:23.680 --> 00:38:26.200
if there are these prejudices already
00:38:26.280 --> 00:38:29.400
built into the local communities
that they live in,
00:38:29.440 --> 00:38:32.000
then they're going to have a hard
time living in those communities.
00:38:32.086 --> 00:38:34.443
The model that my group is working on
00:38:34.671 --> 00:38:37.914
concerns the last stage
of the integration process.
00:38:38.000 --> 00:38:43.760
What we're interested in is the mechanism
by which people who are born
00:38:43.843 --> 00:38:49.086
into a different culture, a minority
culture, integrate into a country.
00:38:49.640 --> 00:38:51.800
It's important to realize that we're not
00:38:51.880 --> 00:38:56.320
dealing with issues that will be settled
within a matter of years,
00:38:56.403 --> 00:39:00.614
but rather decades, and,
dare I say it, even centuries.
00:39:00.960 --> 00:39:03.280
So these are very long term problems.
00:39:03.360 --> 00:39:05.520
And that's one of the things that makes
00:39:05.600 --> 00:39:10.360
simulation of interest, because
we can experiment with artificial
00:39:10.440 --> 00:39:16.440
societies and look at them over a period
of decades, or indeed centuries.
00:39:16.529 --> 00:39:20.400
And we can do so with computer runs
that may last less than a minute.
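[Editor's note: the kind of run described here, many simulated decades compressed into a sub-minute computation, can be sketched with a toy agent-based model. This is purely illustrative, not the project's actual simulation; the agent count, minority share, and assimilation rate are all invented parameters.]

```python
import random

# Illustrative sketch of an "artificial society": each agent holds one of
# two identities, and each simulated year a small fraction of agents
# drifts toward the majority identity. A century of simulated time runs
# in a fraction of a second on ordinary hardware.

def run_society(n_agents=1000, years=100, assimilation_rate=0.02, seed=0):
    rng = random.Random(seed)
    # 0 = majority identity, 1 = minority identity (invented 80/20 start)
    agents = [1 if rng.random() < 0.2 else 0 for _ in range(n_agents)]
    minority_share = []
    for _ in range(years):
        majority = 0 if agents.count(0) >= agents.count(1) else 1
        for i in range(n_agents):
            # each year, a small fraction of agents adopts the majority identity
            if rng.random() < assimilation_rate:
                agents[i] = majority
        minority_share.append(agents.count(1) / n_agents)
    return minority_share

shares = run_society()
print(f"minority share after 100 simulated years: {shares[-1]:.2f}")
```

Running the sweep with different rates or starting splits is the "experiment on a virtual society" move the speaker describes: cheap, fast, and repeatable in a way real policy trials are not.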
00:39:46.557 --> 00:39:48.957
Can you imagine?
They can get 100 people in there.
00:39:50.920 --> 00:39:52.040
Impossible.
00:39:52.120 --> 00:39:54.000
There must be people underneath, too.
00:39:54.080 --> 00:39:55.720
Oh, yeah, yeah.
00:39:55.803 --> 00:40:02.014
They're jammed in skin to skin.
00:40:04.240 --> 00:40:06.360
I mean, not only is there a gigantic,
00:40:06.440 --> 00:40:13.480
immense mound of life jackets, torn,
ripped, some, okay, a lot of them torn up.
00:40:13.520 --> 00:40:17.040
Some of them were fake life jackets,
like they put grass in them to make them
00:40:17.080 --> 00:40:19.640
look as if they were life jackets,
but they really weren't.
00:40:19.729 --> 00:40:21.743
I mean, the scale of it is hard to grasp.
00:40:22.800 --> 00:40:25.720
So I'm like, I'm six foot two
00:40:25.800 --> 00:40:28.280
and I'm looking like,
dead in the middle of that pile.
00:40:28.360 --> 00:40:30.480
So, I mean, they have to be piled at least
00:40:30.563 --> 00:40:34.471
12 or 13 feet high here,
and that's a smaller pile.
00:40:52.000 --> 00:40:55.014
We've got people praying and
singing praise songs over here...
00:41:06.080 --> 00:41:08.080
My place and my safety.
00:41:08.160 --> 00:41:10.080
He is my God and I trust him.
00:41:10.163 --> 00:41:12.371
You know, in Portugal, you have a saying
00:41:13.557 --> 00:41:16.571
which is "Olhos que não veem,
coração que não sente,"
00:41:17.143 --> 00:41:22.214
meaning what the eye doesn't see,
the heart doesn't feel.
00:41:22.886 --> 00:41:32.600
I have, in fact, just one single feeling,
which is, I want what I do to be useful,
00:41:33.800 --> 00:41:39.314
if not for policies, at least
for understanding or for something else.
00:41:40.400 --> 00:41:43.760
The Lord says, I will
rescue those who love me.
00:41:43.840 --> 00:41:46.480
I will protect those who trust in my name.
00:41:46.563 --> 00:41:49.586
When they call on me, I will answer.
00:41:50.043 --> 00:41:52.443
I will be with them in their trouble.
00:41:52.800 --> 00:41:54.920
I will rescue and honor them.
00:41:55.003 --> 00:42:01.114
I will renew them with a long
life and give them my salvation.
00:42:02.086 --> 00:42:06.857
Keep in mind that there is a distance
00:42:07.800 --> 00:42:12.700
between us and these people
that will never be shortened.
00:42:22.760 --> 00:42:25.000
I've seen a lot,
00:42:25.080 --> 00:42:29.040
enough pain and suffering, not in myself,
in other people around,
00:42:29.120 --> 00:42:34.000
that you just have to do something. Coming
in, looking, taking pictures,
00:42:34.086 --> 00:42:41.014
and it felt a little bit touristy,
and it's like tourism of misery and pain.
00:42:42.280 --> 00:42:43.880
Not interesting to me.
00:42:43.960 --> 00:42:46.560
So if you want to do something,
let's do something.
00:42:46.640 --> 00:42:48.760
And if we're going to do something now,
00:42:48.843 --> 00:42:55.129
as a human, let's do something meaningful,
let's do something that means something.
00:42:55.640 --> 00:43:03.240
It's never entirely clear how realistic
these models are,
00:43:03.320 --> 00:43:07.480
but if they're oversimplified,
then they don't seem realistic.
00:43:07.560 --> 00:43:14.960
And people, either lay people or policy
people, can quite rightly object that it
00:43:15.043 --> 00:43:19.243
doesn't correspond to the complex
reality of the situation.
00:43:21.440 --> 00:43:24.120
It's hard for an academic to come to terms
00:43:24.200 --> 00:43:28.480
with how little influence we
actually have as researchers.
00:43:28.560 --> 00:43:30.000
You know, we get these big grants,
00:43:30.086 --> 00:43:34.229
we do this cool research, we get the big
publications, and nothing ever happens.
00:43:34.640 --> 00:43:36.960
Actually having even the smallest amount
00:43:37.040 --> 00:43:40.880
of personal experience of, you know,
actually being in Lesbos or holding one
00:43:40.880 --> 00:43:44.600
of the life jackets of a child that was
not going to work when that boat capsized.
00:43:44.686 --> 00:43:47.143
That life jacket did not
save that child's life.
00:43:47.680 --> 00:43:51.000
And knowing that you're actually holding
the one key that could have helped save
00:43:51.080 --> 00:43:54.400
that life, you kind of realize
that actually you have a skill set
00:43:54.480 --> 00:43:58.480
and you've been trained to think about
things and to come up with solutions
00:43:58.563 --> 00:44:00.957
that actually could
save a child's life.
00:44:01.920 --> 00:44:03.640
It's much more important, really,
00:44:03.680 --> 00:44:06.600
to kind of, you know, get out
of that ivory tower and actually
00:44:06.640 --> 00:44:10.600
deal with real problems and real issues
than just sitting around and debating them
00:44:10.680 --> 00:44:15.320
and talking about them because,
you know, you did a PhD on Foucault.
00:44:15.403 --> 00:44:16.571
It's a bit...
00:44:19.643 --> 00:44:20.814
It's arrogant.
00:44:43.640 --> 00:44:45.360
So now it's the end of the summer.
00:44:45.440 --> 00:44:48.320
It's August, and we're
in Kristiansand, Norway.
00:44:48.400 --> 00:44:50.120
It's been three months since we were all
00:44:50.203 --> 00:44:54.586
together at Lesbos, and every one
of the teams has made terrific progress.
00:44:55.080 --> 00:44:56.560
They'll be arriving soon,
00:44:56.643 --> 00:45:00.857
and we have space set up for them
here at Sørlandet Kunnskapspark.
00:45:01.286 --> 00:45:03.300
Here in Kristiansand at
the University of Agder,
00:45:03.386 --> 00:45:06.237
we've got the time to be
working together for two days.
00:45:06.320 --> 00:45:09.071
We are doing another workshop where all
00:45:09.157 --> 00:45:12.437
of the different teams are getting
together to try and keep working on their
00:45:12.520 --> 00:45:17.586
models of radicalization,
refugee integration, immigration,
00:45:18.686 --> 00:45:22.557
and hopefully we're going to try and
finish some of them off while we're here.
00:45:24.486 --> 00:45:28.457
Here in an academic bubble,
you're sheltered from that,
00:45:28.543 --> 00:45:30.957
you've developed a common
language of understanding.
00:45:31.040 --> 00:45:33.640
You're starting to see results that
00:45:33.720 --> 00:45:36.600
match what you expected
to see or start to match data.
00:45:36.680 --> 00:45:39.640
And so you believe in the
validity of your model.
00:45:39.729 --> 00:45:43.400
And then you tend to think it can answer
even more questions than it actually can.
00:45:48.840 --> 00:45:50.960
We're here today trying to make sure
00:45:51.000 --> 00:45:53.120
that the room is ready
for our presentation tomorrow.
00:45:53.203 --> 00:45:55.671
We have a lot of people from the
university that are gonna come.
00:45:55.760 --> 00:45:57.720
We have policymakers
that are supposed to come.
00:45:57.800 --> 00:46:02.160
We've invited people from different
government agencies in Oslo to come down.
00:46:02.200 --> 00:46:05.800
So this is the first time that we're gonna
be presenting as a team to not just local,
00:46:05.840 --> 00:46:07.640
but also national
government officials, too.
00:46:07.729 --> 00:46:08.829
So that'll be cool.
00:46:08.914 --> 00:46:10.071
Can you hear me okay?
00:46:10.157 --> 00:46:12.237
Not too loud.
Okay.
00:46:12.320 --> 00:46:16.157
Welcome to the Refugees, Religion,
and Radicalization conference.
00:46:16.480 --> 00:46:19.040
Actually, we have a really great agenda,
00:46:19.120 --> 00:46:23.440
I think, and a panel with six of our
experts on the experience of using
00:46:23.520 --> 00:46:26.440
computer models for the scientific
study of religion.
00:46:26.520 --> 00:46:28.240
I'm going to try to present to them what
00:46:28.320 --> 00:46:32.200
models can do, why we're
enthusiastic about the methodology.
00:46:32.280 --> 00:46:34.000
The goal from the beginning for this
00:46:34.080 --> 00:46:39.040
particular public event, this colloquium,
was to try to show to the university
00:46:39.120 --> 00:46:43.840
community here in my local university
in Kristiansand and policymakers
00:46:43.920 --> 00:46:47.720
and collaborators we have here
in the region, and a few colleagues
00:46:47.800 --> 00:46:51.096
who came from other places
in Oslo and Europe
00:46:51.186 --> 00:46:53.643
to show them what modeling could do.
00:46:53.729 --> 00:46:58.437
The way in which this particular
methodology can link together theories
00:46:58.520 --> 00:47:03.760
and show possible causal connections
and lead to policy implications in a way
00:47:03.843 --> 00:47:06.814
that no other particular
methodology can do.
00:47:07.957 --> 00:47:09.329
Questions?
00:47:12.686 --> 00:47:17.386
Can we predict and prevent religious radicalization?
00:47:17.586 --> 00:47:21.200
Of course, the answer to that question
depends on the meaning of every word
00:47:21.286 --> 00:47:22.400
in the question, right?
00:47:22.600 --> 00:47:26.014
What do we mean by predict, prevent,
and religion, and radicalization...
00:47:26.214 --> 00:47:27.371
and "we,"
00:47:27.471 --> 00:47:28.514
and "can."
00:47:28.714 --> 00:47:33.829
Policymakers ask, "What social policies, if any,
can reduce violent extremism?"
00:47:34.029 --> 00:47:36.729
Sometimes, there's a sense of giving up.
00:47:36.929 --> 00:47:39.386
Is there any way to stop
the violence that continues?
00:47:39.471 --> 00:47:40.914
Charlottesville, Barcelona,
00:47:41.114 --> 00:47:43.100
right, it just continues,
over and over and over.
00:47:43.300 --> 00:47:45.514
What policies could possibly help?
00:47:45.714 --> 00:47:50.200
But we can come up with plausible arguments that
we've identified at least some of the mechanisms
00:47:50.929 --> 00:47:56.214
under which, in particular conditions,
there is a high probability of radicalization.
00:47:56.714 --> 00:47:59.929
And I do think that it can
help us get a clearer view
00:48:00.129 --> 00:48:03.114
of the conditions under which
radicalization is likely to occur.
00:48:03.314 --> 00:48:09.029
And, ideally, inform the policy discussions
about these important issues.
00:48:16.160 --> 00:48:17.760
We worked on the model of mutually
00:48:17.840 --> 00:48:20.760
escalating religious
anxiety for a few years.
00:48:20.843 --> 00:48:22.614
It told us that social groups are more
00:48:22.700 --> 00:48:27.114
likely to get into intergroup conflict
when they feel threatened by contagion,
00:48:27.200 --> 00:48:31.771
such as in a pandemic, or by the presence
of social groups different from their own.
00:48:32.280 --> 00:48:36.520
They're much less affected in terms
of intergroup conflict in relation
00:48:36.603 --> 00:48:40.457
to their anxieties connected
to natural hazards or violent crime.
00:48:42.240 --> 00:48:45.080
We found, in this
00:48:45.160 --> 00:48:51.640
particular bit of research, that once
you start to get 60/40 splits
00:48:51.720 --> 00:48:54.480
in populations, things
start to get more violent.
00:48:54.560 --> 00:48:57.120
Populations in which there's about a 60/40
00:48:57.200 --> 00:49:02.000
or 70/30 division between majority
and minority groups are the most
00:49:02.086 --> 00:49:05.414
susceptible to mutually
escalating intergroup conflict.
00:49:06.120 --> 00:49:08.640
This can be useful to policy professionals
00:49:08.720 --> 00:49:13.320
because they can then especially pay
attention to populations in which there is
00:49:13.403 --> 00:49:16.657
that level of division between
majority and minority groups.
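[Editor's note: a toy calculation, not the project's model, hints at why intermediate splits stand out. If a fraction p of the population belongs to the majority, the chance that a random pair of people spans the group boundary is 2·p·(1−p), which is high for 60/40 or 70/30 splits and falls off sharply for lopsided populations.]

```python
# Illustrative proxy for mutual-escalation potential: the rate of
# cross-group contact in a well-mixed population with majority fraction p.

def cross_group_contact(p_majority: float) -> float:
    """Probability that a randomly drawn pair mixes majority and minority."""
    return 2.0 * p_majority * (1.0 - p_majority)

for p in (0.95, 0.9, 0.8, 0.7, 0.6, 0.5):
    rate = cross_group_contact(p)
    print(f"{int(p * 100)}/{int(round((1 - p) * 100))} split: contact rate {rate:.2f}")
```

The actual finding rests on a far richer simulation of anxiety dynamics, but this simple curve shows why attention concentrates on populations near those intermediate divisions.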
00:49:18.443 --> 00:49:21.443
We've completely fulfilled
all of our promises.
00:49:21.529 --> 00:49:24.514
In fact, we've overdelivered
on all fronts.
00:49:24.600 --> 00:49:30.000
We produced more than we promised,
and we've given tons of papers and tons
00:49:30.080 --> 00:49:33.800
of panel presentations,
and we've been really warmly received
00:49:33.880 --> 00:49:36.040
by people in the scientific
study of religion.
00:49:36.120 --> 00:49:37.840
I'm one of the people.
I wanted the influence.
00:49:37.929 --> 00:49:39.986
I wanted my life to be more satisfying.
00:49:40.560 --> 00:49:41.800
So now I can.
00:49:41.886 --> 00:49:44.186
I can take on an issue like
00:49:45.486 --> 00:49:50.743
radicalization and extremist religious
violence, build computer models,
00:49:51.350 --> 00:49:55.100
use what I know from religious studies
and philosophy of religion to talk about
00:49:55.183 --> 00:49:58.917
the theories and try and come up
with a deeper understanding of them.
00:49:59.000 --> 00:50:00.758
That's what I call fundamental research.
00:50:01.800 --> 00:50:03.560
But I can also make a difference
00:50:03.640 --> 00:50:10.680
in the world practically by working
on problems whose
00:50:10.767 --> 00:50:12.700
solutions are
immediately applicable to
00:50:12.783 --> 00:50:13.725
people in the world.
00:50:26.720 --> 00:50:29.520
The Boston Marathon bombing in 2013 is
00:50:29.603 --> 00:50:32.867
an example of how complicated
religious violence can be.
00:50:33.358 --> 00:50:36.817
It showed how you can't blame
it on any one simple cause.
00:50:37.600 --> 00:50:39.960
A young male, second generation immigrant
00:50:40.040 --> 00:50:43.440
with mental illness,
feeling disenfranchised from the host
00:50:43.520 --> 00:50:47.840
culture, radicalized online,
further radicalized on a trip to his home
00:50:47.925 --> 00:50:51.658
country, abandoned by his parents,
recruited his little brother.
00:50:52.242 --> 00:50:56.592
This was a complex interplay of factors
culminating in religious violence.
00:50:59.400 --> 00:51:02.640
So it's not just in England or Norway
00:51:02.720 --> 00:51:07.480
where you start to see this pattern
of second generation immigrants going
00:51:07.520 --> 00:51:09.680
through this radicalization
and identity crisis.
00:51:09.760 --> 00:51:12.120
You see this in Germany,
you see this in Belgium,
00:51:12.203 --> 00:51:14.567
you see this in France,
you've seen this in the United States.
00:51:16.150 --> 00:51:18.475
I'm establishing new vetting to keep
00:51:18.558 --> 00:51:23.275
radical Islamic terrorists out
of the United States of America.
00:51:23.358 --> 00:51:24.442
We don't want them here.
00:51:25.000 --> 00:51:28.880
I think it's the case that unless
the United States and other nations
00:51:28.971 --> 00:51:34.086
throughout Europe and North America
address how it is that they integrate
00:51:34.960 --> 00:51:40.440
immigrants in the coming years, this is
likely to be a pattern that will increase.
00:51:40.520 --> 00:51:44.680
And that's something that is backed up
by our models, and it's backed up by a lot
00:51:44.767 --> 00:51:47.608
of other experts that aren't using
computer modeling and simulation.
00:51:50.517 --> 00:51:52.533
Yeah, the only point is
to make a difference,
00:51:52.617 --> 00:51:53.742
is to solve problems.
00:51:55.067 --> 00:51:57.933
That's me though, I know
I'm a bit weird that way.
00:51:58.242 --> 00:52:00.092
Other people really love
the togetherness stuff.
00:52:00.175 --> 00:52:03.833
You know, let's hold hands and watch
the Titanic go down, you know.
00:52:04.150 --> 00:52:07.197
I'd rather stop the bloody
Titanic from going down.
00:52:07.283 --> 00:52:09.675
And if they'd simulated
it hitting an iceberg,
00:52:10.240 --> 00:52:13.960
what you would see is water filling up
and the whole thing sinking and they never
00:52:14.043 --> 00:52:15.925
would have let it go
out of that Irish port.
00:52:17.000 --> 00:52:17.825
Nuts.
00:52:22.960 --> 00:52:28.200
There is a risk that this project won't
have the outputs that we desire,
00:52:28.283 --> 00:52:30.908
that it won't have the impact
that we're looking for.
00:52:31.720 --> 00:52:35.200
But if the factors that make bad dudes
in the world persist, we
can't really solve those factors.
00:52:39.280 --> 00:52:40.760
We can just identify where they are
00:52:40.840 --> 00:52:44.080
and tell people they should put resources
towards attempting to reduce them.
00:52:44.160 --> 00:52:48.880
But even then you can't
eradicate poverty in the world.
00:52:48.960 --> 00:52:49.680
It's not going to happen.
00:52:49.740 --> 00:52:53.240
You can't take away hunger and you can't,
you can't take away violence.
00:52:53.280 --> 00:52:54.920
You can try and create conditions where
00:52:55.000 --> 00:53:00.480
it's minimized to some extent,
but if you tie yourself to those outcomes
00:53:00.567 --> 00:53:02.917
you're gonna be an
emotional wreck and sad.
00:53:04.200 --> 00:53:05.960
We have to over promise to get more money
00:53:06.040 --> 00:53:11.240
but we can't over promise
and have people pissed that we didn't
00:53:11.320 --> 00:53:14.480
solve problems and we
can't solve all the problems.
00:53:14.567 --> 00:53:16.742
But we're doing a better job than,
00:53:17.240 --> 00:53:20.040
I mean, we're doing a better job
than somebody who's not working on it.
00:53:20.125 --> 00:53:21.233
I mean, that's for sure.
00:53:26.717 --> 00:53:30.642
When I'm with friends who, like me,
are humanities scholars
00:53:31.917 --> 00:53:38.567
we'll sit down over drinks and I'll ask
them, what does our life really add up to?
00:53:38.960 --> 00:53:40.400
What are we really doing?
00:53:40.480 --> 00:53:42.200
Because the world's in trouble.
00:53:42.280 --> 00:53:46.520
The world in some ways is better than it's
ever been, but there are such terrible
00:53:46.603 --> 00:53:51.142
dangers and we've also got the capacity
to make change that we're not making.
00:53:51.560 --> 00:53:54.480
So what do the humanities people sitting
00:53:54.567 --> 00:53:58.500
in their humanities caves like me,
what do we do about that?
00:53:59.200 --> 00:54:03.517
How can we actually make a difference
in real people's lives right now?
00:54:03.600 --> 00:54:05.783
Not just by teaching them
in the classroom,
00:54:05.925 --> 00:54:07.917
but by solving practical problems?
00:54:08.000 --> 00:54:09.050
Can we or not?
00:54:09.680 --> 00:54:14.560
And I think the answer there is that we
can't make a real difference by ourselves.
00:54:14.640 --> 00:54:16.640
We need to team up with people.
00:54:16.720 --> 00:54:18.960
We need to be smart about
the way we build teams.
00:54:19.000 --> 00:54:22.200
So we need to work with engineers and we
need to work with change agents and we
00:54:22.283 --> 00:54:25.125
need to work with NGOs and we
need to work with policy makers.
00:54:25.600 --> 00:54:29.800
All of these people are not us
and we can't do what they can do.
00:54:29.880 --> 00:54:31.720
And we can't have change.
00:54:31.800 --> 00:54:33.640
We can't be involved in making change
00:54:33.720 --> 00:54:36.080
unless we're making it
with those partnerships.
00:54:36.167 --> 00:54:37.150
In those partnerships.
00:54:38.240 --> 00:54:40.560
I really tried my heart out.
00:54:40.640 --> 00:54:42.800
I really tried to make a difference.
00:54:42.883 --> 00:54:44.117
I did my best.
00:54:44.680 --> 00:54:48.120
And so I'm going to be able to look
at myself in the mirror and think
00:54:48.200 --> 00:54:51.560
something other than "you could
have, but you didn't try."
00:54:51.643 --> 00:54:53.233
I'm not going to have that thought.
00:59:36.000 --> 00:59:37.483
Good boy.
00:59:38.083 --> 00:59:39.250
Yes.
00:59:40.442 --> 00:59:41.108
Hey!
00:59:43.840 --> 00:59:45.040
Yeah.
00:59:45.120 --> 00:59:46.160
Oh, heavens.