From Grey's Anatomy to Queen Charlotte: The evolution of Shonda Rhimes' narrative

Shonda Rhimes, known for producing some of the most successful television series of the last decade, has left an indelible mark on the entertainment industry with works like Grey’s Anatomy, Scandal, and Bridgerton. There is no doubt that she is a powerhouse of entertainment and knows how to hook us into series. Her ability to create captivating narratives has led several of her productions to become cultural phenomena. Among her successes, Grey’s Anatomy has spawned two spin-offs: Private Practice, launched in 2007, and Station 19, which debuted in 2018, both well received by the audience.

Well-constructed characters: therein lies success

The latest project by Rhimes in this universe is Queen Charlotte, a prequel to Bridgerton that is available on Netflix. This production has been added to the growing list of content that explores the complexities of the characters and the world that the creator has built. Queen Charlotte not only expands the story of Bridgerton but also offers a deep look at one of its most intriguing characters, taking advantage of the series format to enrich the narrative background.

In a recent interview, Rhimes shared her vision on content production and what she considers essential when developing new stories. The producer emphasized the importance of creating three-dimensional characters and meaningful plots that resonate emotionally with the audience. As she progresses in her career, her focus remains on authenticity and quality, elements that have characterized her work and helped her maintain a strong connection with viewers.

With the success of her series and the expansion of her narrative universe, Shonda Rhimes has established herself as a key figure in contemporary television. Fans of her work eagerly await what the future holds, especially in a landscape where original and well-crafted content has never been in such high demand.

Facebook shareholders vote to overthrow Zuckerberg (it doesn’t work)

Shareholders vote for a new Facebook chairman. Why won’t it work?

Nearly 70% of Facebook’s shareholders voted to reduce Mark Zuckerberg’s power in the company, according to a recent Facebook filing from its shareholder meeting.

Last year, 51% of shareholders voted for an identical proposal. This year, even more Facebook stock owners are tired of how Zuckerberg has been conducting business.

Shareholders are upset with Zuckerberg over several major Facebook scandals.

What do the shareholders want to do?

Shareholders are hoping to hire an independent chairman to hold Zuckerberg and his top team accountable.

Essentially, the shareholders are tired of Zuckerberg and his team not being held accountable. Hiring a chairman could put forward a set of checks and balances that would keep them from causing more harm to Facebook and its users.

Is anything going to come from this?

No.

The issue with hiring a chairman to oversee Zuckerberg is that he owns 60% of the company’s voting power. Zuckerberg and his team voted against hiring an independent chairman, so it’s not happening. Basically, no matter what the majority of shareholders propose, Zuckerberg always gets the final word.

This underscores the importance of an independent chairman. Despite Zuckerberg’s majority voting power, a chairman could approve or veto decisions made by Zuckerberg and his team, providing a layer of accountability sorely needed at Facebook.

Should Zuckerberg not have as much power?

If you keep up with Facebook news, you know there seems to be a new Facebook scandal every week. Although Facebook claims from time to time that it is trying to change, the constant scandals speak for themselves.

To make matters worse, some scandals involved Zuckerberg’s direct actions.

Zuckerberg may be the founder of Facebook, but his actions may continue to put the company in peril. If Facebook wants to outweigh the negative news with positive news, it is going to need to make changes like these.

Facebook deletes 2.2 billion fake accounts

Facebook deleted more than 2 billion accounts. Here’s why you should care.

From January through March of this year, Facebook deleted about 2.2 billion fake accounts from the platform. That’s twice the normal number of accounts they delete in a three-month span. Facebook released this data in a recent blog post. 

What are these fake accounts even used for?

Facebook VP of Analytics Alex Schultz wrote about the prevalence of the fake accounts.

“The number for fake accounts actioned is very skewed by simplistic attacks, which don’t represent real harm or even a real risk of harm,” Schultz writes. “If an unsophisticated, bad actor tries to mount an attack and create 100 million fake accounts — and we remove them as soon as they are created — that’s 100 million fake accounts actioned. But no one is exposed to these accounts and, hence, we haven’t prevented any harm to our users. Because we remove these accounts so quickly, they are never considered active and we don’t count them as monthly active users.”

The vast majority of the deleted accounts were used to boost likes. The idea is that if a page has a high number of likes, it appears more believable and influential.

However, the problem with fake accounts is more serious than Schultz would have us believe. Facebook still deleted, after an appeal process, about 18 million fake accounts that violated its policies on toxic behavior. The violations include:

  • Spam
  • Adult nudity/sexual activity
  • Hate speech
  • Bullying/harassment
  • Violent/graphic content
  • Inappropriate use of drugs
  • Inappropriate use of firearms
  • Terrorist propaganda
  • Child pornography

Counting those that were appealed, Facebook deleted about 4 million posts that promoted hate speech. 

Facebook determined that for every 10,000 content views, 25 contained violent or graphic material, and 11–14 contained adult nudity or sexual content.

For a better understanding of where those numbers come from, check out the chart in Facebook’s blog post.

What’s the significance of this?

Schultz claims the fake accounts were used by “bad actors” to boost likes, but more specifically, these fake likes are also used to boost politicians’ influence.

Last year, Facebook deleted 32 accounts across Instagram and Facebook that used fake accounts to boost likes. These accounts were used to help politically influence Facebook users during midterm elections.

That was only 32 accounts, but today’s news of 2.2 billion deleted accounts shows just how prevalent this practice may be.

Facebook is technically doing better at detecting harassment and fake accounts

Facebook wrote in the same blog post that about 95% of the fake accounts they found were discovered proactively. This means that they were deleted before they could cause any sort of harm. 

When it comes to deleting users and pages that promote hate speech, Facebook now detects 65% of the problematic pages, up from an abysmal 24%. For a better idea of how detection has improved, check out the infographic in Facebook’s post.

Facebook is also doing a better job of detecting accounts that are inappropriately posting about drugs and firearms. However, here the difference is less significant.

How does this affect you?

If you are being harassed on Facebook, the social media platform has plenty of tools to help you. 

However, the cause for concern comes from the fake accounts used to boost a page’s influence. The number of likes a page has should not dictate how it influences you.

If you are looking at an influencer’s Facebook page, judge them by their actions, not their like count.

Facebook braces for users who will erase their data

Facebook is letting users clear their data history, but how will this affect the ad experience?

Before launching its new “clear history” feature, Facebook is prepping advertisers for the change.

Earlier this year, Facebook announced it was adding a “clear history” feature. The feature would show Facebook users which apps and websites were sending data to Facebook; users could then disconnect that data from their accounts and delete the information already collected.

Facebook said in a blog post that there are four key takeaways for advertisers as it rolls out the clear history tool:

  • Giving people transparency and control is good for businesses
  • We’re showing people how advertisers use our tools
  • This feature may impact targeting
  • Measurement will remain intact

Giving people transparency and control is good for businesses

This point is more of an explanation of why Facebook is doing it. Facebook and the apps under its umbrella (Instagram, WhatsApp, and Messenger) are all free for users.

This is only possible because of advertisers. Facebook did not flat-out admit to unethically collecting user data for more targeted advertising. However, it did say that the company can provide advertising options while also protecting users’ privacy.

Facebook has been under fire for how it collects user data. Trust us, we’ve been covering it.

Along with the bad publicity, many users have been deleting Facebook. According to a report from the Pew Research Center, 44% of young American Facebook users deleted the app. Basically, Facebook has realized that it can’t advertise to users if there are no users.

Showing people how advertisers use Facebook’s tools

This one is a bit self-explanatory. Facebook has data on its users; now, they want users to know what advertisers are doing with it.

Here’s where the problem lies: Facebook hasn’t given us the resources to see with full clarity what is happening with our data.

According to Facebook’s data policy, they collect data “including information about your interests, actions, and connections – to select and personalize ads, offers, and other sponsored content.”

You can also select your advertising preferences with Facebook.

However, this doesn’t cover everything by a long shot. Recently, it was discovered that Facebook was using user data as leverage to win advertisers.

In one case, Facebook gave Amazon extensive user data because Amazon was spending money on Facebook advertising.

This feature may impact targeting

Facebook is clarifying that if a user disconnects their data, their data can no longer be used for targeted ads. Basically, many advertising strategies are going to have to change as a result.

Facebook was recently charged by the U.S. Department of Housing and Urban Development (HUD) over discriminatory ad targeting. Basically, companies could refuse to show ads to users based on their race, gender, or sexuality.

Measurement will remain intact

For its advertisers, Facebook has tools for measuring how ads are performing. 

However, these tools will NOT give businesses access to personal data (not that they ever did). Basically, this feature will not be changing, and Facebook wants advertisers to know that.

What should you take away?

Facebook has been in a lot of hot water over how it has handled user data and privacy. In fact, they may be fined up to $5 billion by the Federal Trade Commission over privacy issues.

With scandals, fines, and people leaving Facebook in droves, Facebook is finally making some changes. They are also making sure that their advertisers know it is coming.

Is this going to solve all of our problems with Facebook collecting our data? As history shows, probably not. However, this does look like a step in the right direction.

Facebook moderators face PTSD from graphic content

As toxic as Facebook can be for us, it is even worse for Facebook moderators.

We first heard about the lives of Facebook’s moderators last year when Motherboard published an inside look at the company’s content moderation process, policies, and challenges.

Instead of enjoying all of the tech company perks, from cereal bars to craft beer and foosball, content moderators see high rates of drug abuse, PTSD, and anxiety disorders.

Moderators often work as third-party contractors in a low-pay, high-stress environment. These people see the absolute worst sides of humanity from blood, guts, and gore to hate crimes and abuse.

Unfortunately, it doesn’t seem like much has changed since August 2018. Facebook still treats moderators like second-class citizens. Worker safety still falls by the wayside.

A stark inequality problem

A couple of months back, The Verge’s Casey Newton wrote an investigative piece of his own. He found another set of disturbing revelations about Facebook and its content moderators.

The piece primarily covered issues facing the moderation staff like exposure to violent content and conspiracy theories. However, the report also highlighted the vast difference in the work environment and compensation between typical Facebook workers and content moderators.

The average Facebook employee earns about $240,000 per year. The average person working in the company’s Phoenix-based moderation center earns $28,000.

Facebook’s latest campus, designed by architect Frank Gehry, boasts its own redwood forest and plenty of green space. It is ideal for decompressing after a stressful day on the job. Content moderators don’t have that luxury.

The psychological impact of violent content

According to Psychology Today, violent content from TV news can increase PTSD, anxiety, and depression. Reportedly, watching the news cycle after a mass tragedy can increase your chances of developing something called vicarious traumatization.

Facebook moderators, by contrast, aren’t seeing humanity’s worst events through the filter of a cable news station. They’re on the front lines. So, as one might imagine, the impact of these videos is likely much more significant.

The Verge report describes one instance in which a woman named “Chloe” is asked to moderate a Facebook post in front of a group of trainees.

The post in question depicts a man being stabbed to death while begging for his life. Chloe’s job in this scenario is to tell the group whether the post should be removed. Most people never have to see this type of thing, much less calmly describe the scenario to co-workers during routine training.

The recommended remedy for this vicarious traumatization is to step away from disturbing content, take a break from the news, and try to do something positive. Valid, sure, but when it’s your job to look out for this stuff, there’s no real escape. That’s a tremendous amount of stress to carry around for $28k a year.

It’s clear that Facebook knows the value of creating positive workspaces for their employees — again, see the redwood forest as a point of reference. However, it does not look like they are doing enough for their moderators.

Conspiracy theories spread

The effects of graphic content can have a huge impact on the well-being of the people who work inside the moderation centers. However, Newton found another disturbing trend that has some wider negative implications for society.

Apparently, reviewing conspiracy theory posts spreads the information contagion. Basically, those flat-earthers might be doing even more harm than they realize.

The article mentions a moderator who says he “no longer believes 9/11 was a terrorist attack.” Another walks the floor talking about the flat earth theory.

Who should deal with content moderation, anyway?

Moderators are taking action. Two former moderators have joined a lawsuit against the company, alleging that reviewing violent images on the job brought on symptoms of PTSD.

As it stands, Facebook says that about 38% of the hate speech it detects is caught using AI. The company plans to improve its AI moderation efforts, but there’s a long way to go. Still, human moderators are needed to make sure nothing heinous slips through the cracks.

Unfortunately, protecting the rest of us from Facebook’s horrors comes at the expense of a safe workplace and a healthy mind.

Facebook cracks down on personality quizzes

Personality quizzes may be leaving Facebook in an attempt to protect our data.

We’ve all seen them: personality quizzes on Facebook. They’re all pretty much the same. Your balding, overweight Facebook friend posts quiz results stating that if they were a superhero, they would be Wolverine.

Although these can be fun from time to time, we wouldn’t miss them if they bit the dust.

Facebook recently updated its platform policies. The update says that apps with minimal utility, such as personality quizzes, “may not be permitted on the platform.”

According to a company spokesperson, personality quizzes and other such apps will be heavily scrutinized.

“The update also clarifies that apps may not ask for data that doesn’t enrich the in-app, user experience,” wrote Director of Product Management at Facebook Eddie O’Neil in a blog post.

What does this have to do with the Cambridge Analytica Scandal?

The Cambridge Analytica Scandal occurred a few years ago when millions of Facebook users’ data was stolen. The stolen data fell into the hands of political campaigns.

At the heart of the scandal was a quiz.

A quiz built on the Facebook app “thisisyourdigitallife” collected data from about 87 million users. The people who took the quiz allowed it to collect their data, but they didn’t know the data would be used to aid political campaigns. As a result of the scandal, Facebook ended up deleting several hundred data-stealing apps.

Where do personality quizzes fit into this?

Facebook personality Quiz

Believe it or not, personality quizzes can reveal details about your passwords and security questions. Quizzes often ask for things like the name of your first pet or who you saw at your first rock concert, both common security questions that could let someone log into your account.

Facebook has also said it is revoking expired permissions for apps. Apps that haven’t used or accessed permissions from Facebook users during the last 90 days “may be considered expired.”

Is this going to make a difference?

Facebook has undergone a great deal of scrutiny over the past few years. The U.S. Department of Housing and Urban Development recently charged Facebook with discrimination. This, paired with the Cambridge Analytica Scandal and other data-stealing scandals, has made users cautious of Facebook.

Along with users deleting their accounts altogether, about 40% of users have taken a break from the social media juggernaut.

We can’t say for certain whether this move will remove personality quizzes altogether, or whether it will put a decisive dent in the war against data theft. But it is a step in the right direction. Only time will tell if that step is a tip-toe or a leap.