Keynote: Programming Complexity - Modeling Complex Systems with Ruby and React

Summarized using AI


Sau Sheong Chang • June 23, 2016 • Singapore • Keynote

In his keynote "Programming Complexity - Modeling Complex Systems with Ruby and React" at Red Dot Ruby Conference 2016, Sau Sheong Chang explores programming complexity through a series of simulations that model complex systems and the emergent behaviors they produce.

Key Points Discussed:

- Introduction to Complexity: Chang defines complexity as the emergent behavior from a group of interacting components that cannot be solely attributed to the properties of individual parts. He uses the example of flocking birds to illustrate how simple rules lead to complex patterns.

- Modeling Cultural Interactions: Chang introduces a simulation that models cultural interactions based on Robert Axelrod's research. He discusses how cultures tend to become more alike over time through interactions, highlighting the simulation's ability to show how culture evolves in a shared environment.

- Racial Segregation Simulation: A significant portion of the talk focuses on racial segregation, using a model inspired by Thomas Schelling's work on the topic. Chang explains how individual preferences can lead to segregation even in diverse communities, running simulations that show how neighborhoods become polarized based solely on residents' preferences.

- Bystander Effect and Volunteering Dilemma: Chang illustrates the bystander effect through statistical simulations based on historical events. He presents a mathematical analysis of volunteering behavior in situations where collective action is necessary, showing that as the number of witnesses increases, individual willingness to act may decrease.

- Commonalities in Simulations: At the end of his talk, Chang highlights the common coding strategies used across the different simulations and addresses audience questions about the implications of these models.

Conclusions and Takeaways:

- Emergent behaviors in complex systems can lead to unexpected outcomes, particularly in cultural and social contexts.

- The dynamics of cultural interactions illustrate that cultures can share and adapt traits over time, affecting community identity.

- The segregation model suggests that individual preferences alone can produce broad social patterns of segregation, and that policy enforcement has only a limited mitigating effect.

- Understanding the bystander effect provides insight into how group dynamics can influence personal responsibility in emergencies.

The talk encourages attendees to engage with the simulations available on GitHub, promoting hands-on learning for better grasping these complex behavioral models.


Speaker: Sau Sheong Chang, Managing Director, Digital Technology, Singapore Power

Event Page: http://www.reddotrubyconf.com/

Produced by Engineers.SG

Help us caption & translate this video!

http://amara.org/v/ONqU/

Red Dot Ruby Conference 2016

00:00:15.080 Hello — okay, hi everyone. Thanks for staying back; I still see a lot of people, and I wasn't expecting that many by this time, so thank you for turning up. My name is Sau Sheong and I'm going to talk about programming complexity today: modeling complex systems with Ruby and React. A few years ago I talked about something very similar, and if you've been to that talk you can consider this to be a part two of it.
00:00:50.089 But to begin with I just want to show this slide. Winston, this is for you: this is the first post I made, back in 2006, when we first created the Singapore Ruby Brigade Google group. Choon Keat was the one who created the group — is he here, or has he left? Okay. Right, so he created the group and I quickly beat him to it and grabbed the first post, so now you know what kind of friend I am. Anyway, I took a screenshot of it myself, narcissist that I am.
00:01:30.759 So my name is Sau Sheong, this is my email, and that's how you can reach me. And since we're going down memory lane: I've actually been to a number of Red Dot Ruby Conferences since 2011. I've spoken at every one from the first until today, except for 2014, so this is my fifth time here on this stage, and I am totally overwhelmed by the emotion.
00:02:00.409 I recently changed jobs: I was previously with PayPal, and I recently moved to join a utilities company. It's a government-linked company that provides power, gas and water to the whole of Singapore. If you're interested to know why I made this rather drastic change, you can come and ask me later on. Anyway, I have joined this company. I've been doing Ruby for a very long time — eleven years now and counting — and along the journey I've actually written three books. The most recent book, however, is one on Go; it's supposed to be released next month, so if you're into web programming, that's my pitch for you to buy my book.
00:02:56.489 Anyway, a number of people talked about really in-depth technical topics today, and I just want to say that this is not one of them. My presentation will actually show you no code — but I do have code, so if you want to see it later you can go into this GitHub repository and check it out. I can also show it to you: if enough of you shout and clap, I will show it to you in my text editor, which is what I use.
00:03:34.290 Anyway, let me start with the topic I want to talk about today. Basically I want to talk about complexity. What is it? It's not just about having systems that are very complicated and very difficult to understand — it actually has a fairly definite meaning. It is about behavior that emerges from a group of interacting parts but is not directly the result of the interactions within those individual parts. Basically, something comes out of the individual parts, but it doesn't look as if it came from those parts.
00:04:07.540 Let me illustrate very quickly with a familiar example. You have all seen flocking birds. Each bird actually follows a very simple set of behaviors: it follows the bird in front of it, it tries not to crash into the other birds, and it generally moves in the direction the flock is moving. But the resulting swarm of birds that you see is not something you would expect from just the birds following a few simple rules. In the same way, a school of fish shows the same behavior. This is what I mean by complexity: behavior arising from individual parts interacting in ways you don't expect.
00:04:47.349 What I'm trying to do today in this talk is to look at certain types of complexity problems — problems that have already been modeled. So what I'm going to talk about today is not something altogether new; some of it I've actually presented before, and in fact one of the topics I'm going to talk about was written about before I was born, so you can imagine how old it is.
00:05:16.460 Let me jump into the first one. The first thing I'm going to talk about is modeling cultural interactions, and since this is the year of Euro 2016 I just had to show you this picture — does anyone understand it? As opposed to most people... anyway. So, cultural interactions: expectations, and how you interpret them in terms of your own culture.
00:05:41.980 Some other examples: Mickey Mouse in Disneyland — how many of you have been to the Disneyland in Tokyo? If you notice, Mickey Mouse actually speaks in Japanese, and so do Donald Duck and everybody else. It was totally mind-blowing for me when I was there. Starbucks in Beijing, in the Forbidden City. Chinese food in America. Yoga. And also this monstrosity — I don't really know what to call it, but apparently it's called pizza. So what I'm trying to show here is that cultures do interact with each other, and strange things happen as a result.
00:06:27.050 Somebody actually wrote something about this in 1997: an American political scientist named Robert Axelrod. I did some simulations based on his work in a previous talk. He is also a complexity researcher and a National Medal of Science laureate, so he's someone important and someone really, really smart. He built a model of cultural interactions based on two assumptions. The first is that people who are more similar to each other are more likely to interact; the second is that when cultures interact with each other, they become more alike. Map that back to the few examples I showed you just now.
00:07:14.390 So what I'm going to do is build an agent-based model where each agent represents a culture, then throw the agents into this experiment, run it, and we're going to see what happens.
00:07:28.040 Right, so let's look at the experiment. I define a culture as a set of features — features meaning things like language, religion, the way you dress, the kind of food you eat, and so on and so forth. Within each feature there is a set of possible traits, where a trait is a possible value of a particular feature.
00:07:50.930 Then I model the cultures on a 36 by 36 grid, where each cell represents a culture. Each culture has six features, and each feature has 16 possible traits.
00:08:12.820 So the question is: why did I choose 6 and 16? What do you think?
00:08:24.020 Yes? No, not at all — not in this group. Thank you. It's a lazy man's way of representing the model: basically I use colors — hex color codes, so red, green and blue as six hex digits — and each digit has 16 possible values, so six features with 16 possible traits each.
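[Editor's note: one way to read the hex-colour trick just described — a purely illustrative snippet, not the speaker's code.]

```ruby
# Each culture is a 6-digit hex colour; each digit is one feature
# with 16 possible traits (0..15).
culture = "3fa2c7".chars.map { |d| d.to_i(16) }
# => [3, 15, 10, 2, 12, 7]
```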
00:08:53.370 So the simulation goes this way. Every culture has eight possible neighbours, unless it's on a border or in a corner. The algorithm works like this: at every tick I randomly pick n cells, and I compare the features of each picked culture with each of its neighbours. If the trait difference for the same feature between the two cultures is less than T, then I randomly copy a trait for one feature from one culture to the other. That's a lot of words, so let me show it in a more visual way.
00:09:24.060 Let's say A and B are two neighbouring cultures: culture A has this particular value and culture B has that particular value. I take each of the traits and compare them — in this case, say, the trait in A is 3 and in B it's 5, so the difference is 2 — and I do that for every single feature, then add it all up, and the difference between these two cultures comes to 34.
00:09:52.860 Now of course, if they are totally different then the difference is 96, and if they are totally the same culture the difference is 0. The more similar two cultures are, the more likely there will be a cultural exchange; therefore the probability of a cultural exchange is 1 minus the difference between the two cultures divided by 96 — simple arithmetic gives the probability for that particular example (1 - 34/96, or roughly 0.65).
00:10:17.190 And when an interaction does happen, I copy one trait from one culture to the other. So as you can see here, one culture influences the other: one trait is copied across, and therefore something changes — there's an interaction.
00:10:31.650 So what do we want to measure here when we actually run the simulation? In particular I want to measure three things. The first is the average feature distance: this tells us how different the cultures are from each other at the end of each round of the simulation. The uniques tell us how many unique cultures there are at any given point in time. And lastly, the changes tell us how vibrant the cultural exchanges are. So those are the three metrics I want to measure for the simulation.
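[Editor's note: the talk shows no code on screen, so here is a minimal sketch of the interaction step and the three metrics just described, using the talk's own assumptions (six features, 16 traits, differences measured out of 96). Names and structure are illustrative, not the repository's own.]

```ruby
FEATURES = 6      # one feature per hex digit of a colour code
TRAITS   = 16     # each digit ranges over 0..15
MAX_DIFF = 96     # the denominator quoted in the talk (6 x 16)

# A culture is an array of six trait values, e.g. [3, 15, 0, 7, 9, 2].
def random_culture
  Array.new(FEATURES) { rand(TRAITS) }
end

# Sum of per-feature trait differences between two cultures.
def difference(a, b)
  a.zip(b).sum { |x, y| (x - y).abs }
end

# One interaction: with probability 1 - diff/96, copy one random trait
# from culture a into culture b.
def interact(a, b)
  return unless rand < 1.0 - difference(a, b).fdiv(MAX_DIFF)
  i = rand(FEATURES)
  b[i] = a[i]
end

# The three metrics from the demo, computed over all cells of the grid.
def metrics(grid, changes)
  cells = grid.flatten(1)
  pairs = cells.combination(2).to_a
  {
    avg_distance: pairs.sum { |a, b| difference(a, b) }.fdiv(pairs.size),
    uniques:      cells.uniq.size,
    changes:      changes            # count of trait copies made this round
  }
end
```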
00:11:11.000 Now let me just very quickly run the simulation — I'm going to use Puma to run it. Okay, and this is live, by the way.
00:11:35.860 So let me just show you. I'm going to start the simulation. In the chart on the left you have blue, which is the average distance between two cultures; red is the number of changes; and yellow is the number of uniques.
00:11:54.940 You can see that over time the average distance reduces — as the experiment goes on, the differences between the cultures actually reduce. The number of uniques also reduces, because as time goes by the cultures become more and more like each other. And finally, you can see that there are always changes happening.
00:12:20.260 You can see it here in the nicely changing colours: when you see big blotches of a particular colour, that basically means a culture has dominated that part of the simulation. And over time you see that it doesn't remain constant — some parts of it turn green, some turn yellow, and other parts turn blue. I'm going to stop this for now and get back to the slides.
00:13:01.290 Oops, okay. So this is the simulation, and over a period of time — I'm not going to run the entire simulation during the presentation — but over time this is roughly what you get: the yellows, which represent the number of uniques, drop to a particular constant, the changes remain about the same, and the distance doesn't actually drop to zero; nor does it drop below the number of uniques.
00:13:24.660 So let's look at some observations. Again, as I said, this is a simulation — I didn't actually plan any of this; these are just observations from the simulation. The eventual equilibrium is that there are a few dominant cultures. That's to be expected: as cultures interact with each other, one will eventually dominate. The dominant cultures can also be quite different from each other — you can see the colours are very different.
00:13:57.990 If I reduce the area — I used a 36 by 36 grid because it looks pretty, but I could use a smaller one or a bigger one — a smaller area results in faster equilibrium and a smaller number of cultures. So in a smaller geographic space you would expect more exchanges and fewer dominant cultures.
00:14:20.100 But one thing I didn't really expect, which I observed here, is that a culture that is more dominant at one point in time isn't necessarily the one that is dominant in the end. And a dominant culture doesn't mean one of the original cultures simply wins: when two cultures interact they effectively form a separate subculture, a cross between the two, and that's the one that becomes dominant. So in a way that's a simulation of how cultures interact. If you want to make more observations or try variations, please feel free to take the code — it's on GitHub.
00:14:56.730 So now, that's culture. Let me talk about another simulation, about racial segregation.
00:15:04.610 Singapore is really — I mean, for those visiting, you'll realise Singapore is actually very racially diverse; there's a lot of diversity here. So avoiding racial segregation is something that's important in Singapore, and increasingly around the world. You've probably noticed that as different people move into other countries and so on, segregation of people of different races, cultures and backgrounds becomes increasingly important. So I think this is an interesting model — but again, take it with a pinch of salt: this is a computer simulation model, it is not a reflection of real life.
00:15:42.709 In 2009 a cartographer called Bill Rankin took a map of Chicago and placed a coloured dot for roughly every 25 people, coloured by the race of those people, and he created a map. What came out was quite startling: you see big blotches of blue, pink and orange, and they represent the different races in Chicago. Very obviously there is some kind of racial segregation, sometimes with very sharp edges — if you look at the blues and the oranges, sometimes there's just a dividing line between them.
00:16:31.399 So that's pretty stark. Is it only Chicago? Probably not. The same thing happens in other cities in the US, and the same maps were made for other cities, and this is what came out: in Detroit it seems pretty serious — it's like blue and pink; New York, LA, Washington DC. Some other cities around the world were also mapped, and London also showed something very similar.
00:17:07.730 So the question is: why is there such segregation? London is supposed to be one of the most racially diverse cities in the world, so what is really happening? Is segregation inevitable?
00:17:19.730 Thomas Schelling is an American economist, winner of the 2005 Nobel Memorial Prize in Economic Sciences. I once made the mistake of saying that Thomas Schelling won the Nobel Prize for economics and got bashed up quite a lot, so I'm careful to say that he actually won the Nobel Memorial Prize in Economic Sciences. He wrote his paper in 1971, called "Dynamic Models of Segregation", in which he did a simulation of how races become segregated.
00:17:51.290 The way that he did it is very different from the way I'm going to use today, because what he did was basically use a hand-drawn grid and coins, and he flipped the coins whenever there was some interaction. What I'm going to do today is absolutely not that — I'm not going to use coins; I have computers to help me.
00:18:21.560 So I'm going to do a simulation using exactly the same model that I used earlier on: a 36 by 36 grid, eight neighbours each, but the algorithm is slightly different. At every tick I will check every cell, instead of a random set of cells. If at least n of a cell's neighbours are of the same race, it doesn't do anything. So imagine every household occupies one cell: if I find that enough of my neighbours are of the same race as I am, I won't do anything; but if the number of neighbours of a different race goes above a certain threshold, then I will try to find somewhere else to move to — I pick an empty cell and I move there.
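[Editor's note: the repository isn't shown at this point, so here is a minimal sketch of one tick of the rule just described — every unhappy household moves to a random vacant cell. The race symbols, the `min_same` parameter name and the overall layout are illustrative assumptions, not the talk's actual code.]

```ruby
SIZE = 36

# Count how many of a cell's (up to eight) neighbours hold the given race.
def same_race_neighbours(grid, r, c, race)
  offsets = [-1, 0, 1].product([-1, 0, 1]) - [[0, 0]]
  offsets.count do |dr, dc|
    nr, nc = r + dr, c + dc
    (0...SIZE).cover?(nr) && (0...SIZE).cover?(nc) && grid[nr][nc] == race
  end
end

# One tick: every household with fewer than min_same same-race neighbours
# moves to a randomly chosen vacant cell (nil marks a vacancy).
def step(grid, min_same:)
  vacant, unhappy = [], []
  SIZE.times do |r|
    SIZE.times do |c|
      race = grid[r][c]
      if race.nil?
        vacant << [r, c]
      elsif same_race_neighbours(grid, r, c, race) < min_same
        unhappy << [r, c]
      end
    end
  end
  unhappy.shuffle.each do |r, c|
    break if vacant.empty?
    dr, dc = vacant.delete_at(rand(vacant.size))
    grid[dr][dc] = grid[r][c]
    grid[r][c] = nil
    vacant << [r, c]
  end
  grid
end
```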
00:19:03.100 So what are the parameters I'm going to measure and use for the simulation? First, there's the acceptable number of neighbours — the n in the rule just now — which captures how tolerant I am of living in a diverse neighbourhood.
00:19:28.150 The second one is the number of races on the grid. I'll start out with two, but as you'll see, I will increase the number of races in the grid. Then there's the percentage of vacant cells: of course, with zero vacant cells I basically can't move anywhere, so it's a kind of useless simulation, but if I increase the percentage of vacant cells, does that help? Is it better if it's more spacious — would there be more segregation or otherwise? So that's one of the parameters I'm playing with.
00:19:50.230 And lastly, I use a policy limitation. This has been used in many countries, where you say that a certain percentage of a particular area cannot have a majority of a certain race, and so on. That's the policy limitation, and I want to simulate it to see whether it is useful — whether it is something that would help reduce segregation. So let me run the simulation here.
00:20:22.129 While this is finishing, you can see the dominant races here — just now these were the dominant cultures. So let me go here.
00:20:34.230 I've set the acceptable number of neighbours to 2, the number of races to 2, 20 percent of the cells are vacant, and the policy quota is set over here. The grid is randomly assigned to start with. What I'll do is start the simulation, and as you can see, the grid quickly becomes segregated — with this particular set of parameters it becomes segregated quickly.
00:21:07.740 So let me just increase the acceptable number of neighbours and set it up again. You see, early on, there are actually large areas where people are segregated. Let's run the simulation again — okay, maybe that's just too much; let me just reduce it a bit and do that again. Again you see segregation, but this is something different now: these are even bigger blotches. And as you saw earlier, as I increase the acceptable number of neighbours and start again, something strange happens: it keeps moving, it doesn't actually stabilise — it doesn't come to a steady state.
00:21:50.460 That tells us certain things as well. Again, I can change the number of races here — oops, okay, that's the wrong number to use. Set it up again, and you can see that there's actually segregation between the races, and this stabilises. If I go up to four, it actually does not stabilise — at four you go into a continuous loop. Okay, let me get back to the slides. I'm not going to fight it — it's okay.
00:22:42.100 Okay, so that was the simulation I just ran. Again, these are my observations, and as I mentioned earlier, you can take the same code that I used and play around with it — feel free to try out your own simulations. First: segregation happens even when there is a weak preference for neighbours of the same type. Even if I'm very accepting of having neighbours of different races, segregation happens.
00:23:07.870 The weaker the preference, though, the less segregated we will be. So if I'm okay with a diverse mix of neighbours, then the clusters become smaller, which is good news — basically we're saying that the less racist we are, the less segregated we will be, which is the good news. Conversely, the stronger the preference, the more segregated — the opposite of what I said just now. But at a particular threshold, strong preferences result in an unstable but non-segregated state. That sounds good — nothing gets segregated — but at the same time people keep moving, so that's no good either: it basically means chaos, where the city, the province, the state, whatever it is, is in constant movement.
00:23:59.110 The number of races has no impact on segregation. This is a little bit surprising, but if you have two different races you get segregated into two clumps, and if you have ten different races you just get segregated into ten different clumps. And the number of vacant cells has no impact on segregation either: however large the area, however much vacant space the city has, there's no particular impact — you will still be segregated.
00:24:25.280 And lastly, policy enforcement has limited impact on segregation. It does have a certain impact, because you do get people to not be so segregated, but at the same time, if you impose policies that are too strong, what happens is that it will result in an unstable state again, which is undesirable. Now, what is the solution? I have no solution — I'm just running a simulation. So take this simulation, try it out and see what happens.
00:25:01.760 Okay, so I've done two simulations; now I'm going to switch gears a little bit to something slightly different. I'm going to model something called the bystander effect. This is a very famous case: in 1964, a young woman called Kitty Genovese in New York, who worked in a bar, was walking home when she was attacked by an intruder. What happened was she was actually attacked a couple of times. It was not in broad daylight, but it was close enough to an apartment block, and apparently — at least according to the sensationalist headlines — 37 people saw the murder but didn't do anything about it. It was a big wake-up call for a lot of people: what's happening to this world, are we so desensitised to violence that we no longer care about the people around us? And that was in 1964.
00:25:56.960 In October 2011, in Foshan in China, a two-year-old girl was hit by a van and then run over again by a truck, and she was basically not helped by any bystander — a large group of people passed her by — until she was actually rescued quite a while later. Unfortunately she passed away as a result. So that's another example of what certain people call the bystander effect.
00:26:36.070 And the bystander effect is something that has been talked about in game theory, so this is the next thing I'm going to model. Game theory is the study of mathematical models — so more mathematics here — of conflict and cooperation. It's used in economics and in political science, and it's an example I'll use for complexity science as well.
00:27:04.539 The particular topic is what's called the volunteer's dilemma. Basically, if there's a large group of people witnessing something, does any one of them volunteer or not? Because there is a cost to volunteering, and there's a cost to not volunteering.
00:27:25.709 Modelling that with mathematics, you have these variables. There's V, which is the value gained if at least one person volunteers — meaning if somebody actually volunteers and shouts "stop, stop the murder" or "stop the robbery" or whatever, then that act would have been stopped. At the same time there's C, the individual cost of volunteering for the volunteer: maybe it costs their time, maybe it draws attention to them and they get attacked instead of the other person. And of course there's the overall cost if no one volunteers — what's the cost when nobody volunteers? In the cases we saw just now, people actually died when no one volunteered, so it is actually a tragedy.
00:28:07.989 Game theory has something called a payoff matrix. The payoff matrix for the cases I showed just now would be much larger, so let's just start with a two-player game — you and me.
00:28:25.270 So you have you and me, and you form a matrix of whether I volunteer and whether you volunteer. In the cases where I volunteer, whether you volunteer or not, my payoff seems to be the same — there doesn't seem to be any difference. If I don't volunteer, then whether you volunteer or not does make a difference: I get V if you volunteer, and V minus the overall cost if you don't.
00:28:58.120 Using this payoff matrix we come up with something called a mixed-strategy Nash equilibrium, which is the standard way to solve this game. We get this equation: V minus C equals P — where P is the probability of the other person volunteering — multiplied by V, plus the no-volunteer term, and so on and so forth.
00:29:13.120 With this particular equation we then change it to an N-player game. Instead of having just two players we have N players, so instead of a single P we get the probability that at least one of the other N minus 1 players volunteers, and the probability that none of them volunteers is the complementary term raised to the power N minus 1. Put everything together, derive it, and you get a final formula for the probability of volunteering.
00:29:38.470 So that's the mathematics part of it. What I do with this particular probability is then run some Monte Carlo simulations on it, and I'll show you the model.
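[Editor's note: the slide with the final formula isn't reproduced in the transcript. The sketch below assumes the standard mixed-strategy result for the volunteer's dilemma — with individual cost C and overall cost L (L greater than C), each of n witnesses volunteers with probability p = 1 - (C/L)^(1/(n-1)). The values C = 1 and L = 7 are illustrative only; with those numbers the individual probability at n = 3 comes out close to the 62 percent mentioned later in the talk.]

```ruby
# Probability that any one of n witnesses volunteers, under the standard
# mixed-strategy equilibrium of the volunteer's dilemma.
def p_volunteer(n, c:, l:)
  return 0.0 if c >= l          # volunteering costs more than the disaster
  1.0 - (c.to_f / l)**(1.0 / (n - 1))
end

# Monte Carlo check: how often does at least one of n witnesses act?
def p_someone_acts(n, c:, l:, trials: 100_000)
  p = p_volunteer(n, c: c, l: l)
  hits = trials.times.count { n.times.any? { rand < p } }
  hits.fdiv(trials)
end

[3, 5, 10, 37].each do |n|
  printf("n=%2d  p(individual)=%.2f  p(anyone)=%.2f\n",
         n, p_volunteer(n, c: 1.0, l: 7.0), p_someone_acts(n, c: 1.0, l: 7.0))
end
```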
00:29:55.059 Is this still going? All right, so this is the volunteer's dilemma. The x-axis is the number of agents — basically the number of players, or the number of witnesses if you like — and the y-axis is the probability of volunteering: how likely it is that somebody would volunteer and say something.
00:30:18.879 So let's look at increasing or decreasing the cost of volunteering. If I decrease the cost of volunteering, naturally we'd expect that, okay, it's less costly to volunteer, therefore more people will — and as you can see, that's true: if it becomes trivial for me to volunteer, then there will be more people volunteering. And of course, if the cost of volunteering is larger than the overall cost, then nobody volunteers.
00:30:53.499 Now let's look at it the other way round: the overall cost of not volunteering. If nobody volunteers and does something, there will be total disaster — if I don't shout "fire" when there's a fire, the whole block burns down and everybody dies. That's the worst case possible. So if I increase the cost of not volunteering, then of course the likelihood of somebody volunteering increases as well.
00:31:30.509 You can see something interesting, though, as I show this particular model: at the start, with three agents — three players — the probability of volunteering is sixty-two percent, and as the number of agents increases, the probability of volunteering actually decreases.
00:31:53.759 Reflect back on the example I gave just now, Kitty Genovese: if there had been just one person witnessing the murder, then the probability of that person actually shouting out and scaring off the attacker might have been a lot higher than when there were 37 people. You can see that in the model here.
00:32:13.829 So let's look at the last parameter that I wanted to model: increasing the number of agents. Do you realise what happens here? It doesn't matter how many agents there are — the probability stays just about the same. It does not mean that the more people witness something, the more likely it is that somebody will shout out loud. That's not the case.
00:32:49.700 Now, what does this tell us? Well, let me get back to the slides. Again, I'm just running a simulation — I'm not telling you what happens; you decide. This is a simulation, not a real-life observation. What can we do?
00:33:01.720 First of all, we can decrease the cost of volunteering. If we don't want such tragedies to happen, we should decrease the cost of volunteering — make it easier for people to volunteer. We could also increase the overall cost of nobody volunteering; of course we don't really want that — you don't want to make things worse when nobody volunteers — but nonetheless that is a way of increasing the probability of somebody volunteering.
00:33:30.680 Notice again that it's not really about the exact, absolute numbers, but about the difference between the individual cost and the overall cost. So maybe the answer is that while you keep the individual cost low, you also make the difference between the individual cost and the overall cost very high.
00:33:48.670 This one is a bit counter-intuitive: if you reduce the number of players — the number of agents — then the probability of volunteering increases. So maybe that's what should happen. It's not necessarily possible for an accident in a crowd, because there are just so many people there, but maybe there are other ways — maybe anonymous volunteering, for instance — that could be a way of effectively reducing the number of players.
00:34:19.000 And finally, the observation here is that increasing the number of players has no impact, or even a negative impact, on the overall probability of volunteering. I hope you guys are still with me.
00:34:38.139 Right, I've actually come to the end of the talk. I hope it has been interesting for you. If you want, you can take a look at the GitHub repository — I have it here — and feel free to play around with it, and if you have any questions, please feel free to ask. So thank you, thank you very much.
00:35:06.550 [Emcee] Would anyone like to ask a question? Okay.
00:35:19.580 [Audience] There's a switch here — oh, okay, yeah. You said that if we shouted and screamed you'd show us a bit of the code, so I thought I'd ask this question: you showed basically three different simulations, and I'm wondering what commonalities underlie the code. Maybe that would be an interesting way to show off the code — what do you think?
00:35:46.010 [Sau Sheong] Oh, I can show you — you don't need to shout, actually. Thank you. The previous slide... so, basically, I just used Sinatra, because it's simple enough — Sinatra and JSON — and I created the different simulations here. The first one is the culture simulation, and it uses a grid; the grid itself is quite simple, quite a simple kind of algorithm. For the cultures I use a combination of bit masks to work out the differences, and some very simple arithmetic to just find the distance between two different numbers, and so on. I didn't want to show it because it's not very sexy, you know — it's not very complicated, it's just simple code.
00:36:46.480 [Audience] Simple code is sexy. Thank you — I wasn't fishing for it, by the way.
00:36:58.730 [Sau Sheong] Yeah, so I hope that's okay.
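[Editor's note: the answer mentions Sinatra and JSON but shows no code, so here is a guess at the kind of wiring involved — a tiny Sinatra endpoint that steps a simulation on each request and returns the grid as JSON for the React front end to poll and draw. The Grid class and the route name are stand-ins, not the actual repository code.]

```ruby
require 'sinatra'
require 'json'

# Stand-in simulation: a 36x36 grid of hex-digit traits that drifts randomly.
class Grid
  attr_reader :cells

  def initialize(size = 36)
    @cells = Array.new(size) { Array.new(size) { rand(16) } }
  end

  def step!
    @cells.each { |row| row.map! { |t| rand < 0.05 ? rand(16) : t } }
  end
end

grid = Grid.new

get '/grid' do
  content_type :json
  grid.step!
  { cells: grid.cells }.to_json
end
```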
00:37:04.240 [Audience] Okay, thank you. Regarding the volunteer's dilemma: where do personal choice and freedom of thought come in, in terms of people's attitudes?
00:37:17.020 [Sau Sheong] So, I think those things definitely do count, except that this is a mathematical model — it basically models large groups of people, something like psychohistory, if you guys recognise that... no? Never mind. But it is statistical — statistical, mathematical modelling — so yes, individual will does count, but in this case I didn't put it into the model.
00:37:52.810 [Emcee] Okay, thank you very much, Sau Sheong. Thank you.