Do the Recent Acts of Violence Promote Fear?
After the recent violent acts that have happened in this country, the emotion expressed most is fear. It is deeply traumatic to see anyone's life taken, especially on camera.
We are constantly exposed to extreme acts of violence in this country, whether it is police officers being murdered for simply doing their jobs or unarmed Black men being shot down. This is a period of American history that will be remembered for its random acts of violence.
We all know violence isn't the answer, so why are these acts so widely publicized, and why have they become normalized? Do we really live in a society we have to fear day to day? Or do you feel safe, believing these random acts of violence are just that: "random"?