
Thursday, April 02, 2015

An app to promote social responsibility


This post is a follow-up about people who carry critical responsibilities involving others’ lives. Stressful events occur as often in their lives as in anyone else’s, and only people very close to them – friends, family, and doctors – may be aware of these events. A doctor may schedule a biopsy for a pilot for the following week, and that pilot might be flying an intercontinental flight the same night. Events do not always leave adequate recovery time between them; sometimes they pile up on an individual, creating extreme stress.

Let us assume that carrying critical responsibility over other people’s lives means that one might need to make some compromises regarding one’s own privacy. Assume that the law is amended, if necessary, to permit the kind of monitoring I propose below, with an appropriate set of safeguards incorporated. The individual concerned would enroll in a program, identifying five or more people close to him whom he authorizes to report weekly on any stress he might be under and on other circumstances that need the attention of his supervisor. These contacts would use an app to send a machine-processable weekly report indicating any stress, incident, or other problem. Alternatively, there would be a positive report saying that the reporting person had been in contact with the monitored person and was not aware of anything to cause concern. The contacts would, of course, have to be educated and responsible people, well briefed and knowledgeable. Most of the time a computer would crunch the incoming reports and file them away in some well-protected manner. Exceptional circumstances would be "recognized" by an algorithm and reported to the supervisor immediately.
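To make the idea a little more concrete, here is a minimal sketch, in Python, of how such weekly reports might be collected and screened. The report fields, the escalation rules, and names such as `WeeklyReport` and `review_week` are illustrative assumptions on my part, not a finished or recommended design.

```python
# A minimal sketch of the weekly-report screening idea described above.
# All names and rules here are illustrative assumptions, not a real system.
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class WeeklyReport:
    contact_id: str        # one of the five or more authorized contacts
    week_of: date
    had_contact: bool      # did this contact actually see or speak with the person?
    stress_observed: bool  # any stress, incident, or other problem noticed
    note: str = ""         # optional detail; would be stored well protected in practice

def review_week(reports: List[WeeklyReport]) -> str:
    """Classify one week's reports: routine filing vs. alerting the supervisor."""
    concerns = [r for r in reports if r.had_contact and r.stress_observed]
    silent = [r for r in reports if not r.had_contact]

    # Illustrative rules: any reported concern, or a week in which no contact
    # saw the person at all, is escalated; everything else is just filed away.
    if concerns:
        return "ALERT: {} contact(s) reported stress or an incident".format(len(concerns))
    if len(silent) == len(reports):
        return "ALERT: no contact reported seeing the person this week"
    return "OK: file reports securely, no action needed"

# Example: four positive reports and one contact who noticed a problem.
week = [WeeklyReport(f"c{i}", date(2015, 4, 2), True, False) for i in range(4)]
week.append(WeeklyReport("c5", date(2015, 4, 2), True, True, "seemed very stressed"))
print(review_week(week))
```

The real difficulty, of course, is not this trivial aggregation step but deciding what counts as an "exceptional circumstance", who sees the alert, and how the data is protected.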

I am not saying that any such scheme should be implemented in the real world right now. All I would like to see for the present is discussion of the issues involved and some psychological experimentation. Perhaps there is a way to make people’s lives a little safer. We could do it, hopefully, without being unfair to people who carry onerous responsibilities, and without dehumanizing ourselves in implementing the system. Caring for others’ lives need not conflict with compassion for the individual under stress or in other unfortunate circumstances.

Saturday, March 28, 2015

Social responsibility and detecting depression


I usually post what I think are interesting project ideas on this blog. This is an exception – I have no answers, but I do have what I think is an interesting question. The tragic death of nearly 150 passengers and crew in the crash of Germanwings flight 4U9525 was the trigger that set me thinking. The allegation that the co-pilot might have intentionally crashed the plane raises the question: can we detect such risks in advance and possibly prevent them? CNN was discussing whether there should be cameras in aircraft cockpits, but someone pointed out that they are unlikely to help prevent crashes; they can only help in the investigation that follows one.

Let me leave behind the specific incident of 4U9525. We don’t know enough about it to discuss it any further.

The more general question is about people who carry critical responsibilities – surgeons who handle very tricky operations, pilots, people whose staff carry a box with nuclear trigger buttons, and those who sit in cold, underground bunkers waiting for a signal to launch missiles. I would also add to this group the millions of car and bus drivers whose actions or inaction can cause deaths. What if they had been in a bar fight the night before, or their girlfriend had told them to get lost, or a doctor had told them he would like them to take a test for early signs of Alzheimer’s disease? These situations are very personal ones. Only people very close to the individual – the spouse, boyfriend or girlfriend, doctor, and so on – know about them in time to prevent tragic developments.

Is there a role we can visualize for the people very close to a person carrying critical responsibilities, both in protecting him or her and in protecting the hundreds or thousands who may be endangered by an act of desperation? As an example, ask yourself this: if you were a pilot, would your spouse have a social responsibility to help prevent you from doing something desperate? Or does your right to privacy override your spouse’s social responsibility? Can a loner carry a critical responsibility, or is it better for anyone carrying a major responsibility to have the support of a few trusted people? How would information about risks be shared, and what privacy safeguards would be possible?

I do not know the answers, but you should think and ideally write about this topic. Students of technology do not usually face questions like this, but an education in science or technology cannot be complete without some attention to issues like this. Another possibility is collaboration between people who could put together an app to facilitate sensitive messages, and people interested in experiments in psychology, to do collaborative work to gain insight into this question. This suggestion brings to my mind a Wikipedia article I had read years ago http://en.wikipedia.org/wiki/Stanford_prison_experiment  That article alerts one to the risks on psychological experimentation involving human beings and their behavior under conditions of extreme stress.