The first thing you learn in Social Work 101 is the Code of Ethics. It feels solid—a laminated compass designed to guide you through murky waters. Confidentiality. Self-determination. Social justice. Non-maleficence.

But what happens when the society those ethics were written for changes underneath your feet?

So, how do we practice "Person-in-Environment" when the environment is unrecognizable? Here are three ethical friction points defining social work today.

1. Confidentiality Under Siege

The core ethic of confidentiality is under siege. In the past, privacy meant a locked filing cabinet. Today, it means navigating a nightmare of group chats, telehealth glitches, and third-party apps. Consider the school social worker who asks a teenager about their weekend. The teen mentions a fight with a friend on Instagram. The social worker now has a choice: Do they look at the public story to verify the risk? If they see a post about suicidal ideation, do they screenshot it? Does that screenshot become part of the clinical record?

2. Self-Determination vs. Disinformation

Self-determination rests on the conviction that clients are the experts in their own lives. But what happens when a client's "choice" is based on disinformation that threatens their life or others? The changing society demands a new nuance: we must now ethically assess whether a client can meaningfully consent when their information ecosystem is weaponized.

3. The "Efficient" Algorithm vs. The Human Relationship

Social justice is the third pillar. But what happens when the systems we rely on to distribute justice go black box? Increasingly, welfare eligibility, child protective services triage, and housing allocation are being run by predictive algorithms. A machine flags a family as "high risk" based on zip code data, not clinical observation.

The ethical question is this: when the algorithm's "efficiency" contradicts what the human relationship tells us, which one do we trust?