Haim Yativ, spokesman of "Nakim"


Joined: 23 Jan 2006    Posts: 1013
Posted: Sat 30.01.21 22:32    Subject: It is proven: how many lives would have been saved were it not for the censorship and distortions of Facebook, the establishment and the media during Corona
It is proven: how many lives would have been saved were it not for the censorship and distortions of Facebook, the establishment and the media during Corona.
Facebook has again deleted, without warning and this time completely, the "Nakim" page, which had over 1,200 followers and a huge surge in views and followers over the past month. I spent thousands of shekels promoting posts that reached hundreds of thousands of views and hundreds of shares, among them "There is a cure for Corona, it is scientific," which publicized, among other things, hydroxychloroquine, zinc, and vitamins D and C as a treatment for Corona, here:
http://www.nakim.org/israel-forums/viewtopic.php?p=276086
and even over a thousand shares for the summary of the criminal complaint against the promoters of the Corona vaccine that was filed in France:
http://www.nakim.org/israel-forums/viewtopic.php?p=276087
Now a story is being published around the world reporting that Facebook has backtracked on its censorship of posts that presented hydroxychloroquine as a treatment for Corona, after the highly regarded American Journal of Medicine published a scientific paper on the treatment of Corona that includes hydroxychloroquine as a curative drug.
https://www.amjmed.com/article/S0002-9343(20)30673-2/fulltext
Angry articles argue that, because of the censorship by Facebook and its like, close to half a million people have died in the United States.
https://www.conservativenewsdaily.net/breaking-news/after-440000-americans-are-dead-facebook-and-american-journal-of-medicine-admit-their-stand-on-hcq-was-wrong-these-people-should-be-prosecuted/
And here in Israel, several thousand people in all likelihood died because this scientific information was hidden from the public; I tried to spread it through the Nakim website and its Facebook page, which, as noted, was completely deleted by Facebook today.
The censorship of hydroxychloroquine (hereafter: chloroquine) as a treatment, and of the other drugs I had publicized as early as December, was not accidental. In France the authorities halted the Discovery trial of chloroquine in the middle of last summer, and the World Health Organization published a statement explaining why it stopped its clinical trial of chloroquine in June 2020, citing the halting of the European study:
https://www.who.int/news-room/q-a-detail/coronavirus-disease-covid-19-hydroxychloroquine
In France we witnessed the character assassination of, and the threats made against, Prof. Didier Raoult, a world-renowned expert on viruses, who insisted on treating patients with chloroquine and continued to publish on its effectiveness.
Today everyone understands that he was right, and the suspicion is that it was no coincidence that the government under President Macron banned the over-the-counter sale of chloroquine in pharmacies at the beginning of 2020, as had been customary for decades. Nor, apparently, was it a coincidence that Macron's health minister issued a decree forbidding family doctors from giving chloroquine to Corona patients, on the basis of a fraudulent scientific paper in The Lancet. That Lancet paper was eventually recognized as a distortion and an apology was issued, but France's health minister did not rescind his decree.
The suspicion is that all these distortion efforts against chloroquine were not random but carefully directed at forcing people to be vaccinated with the Pfizer and Moderna vaccines as the last way out of the Corona decrees and the disease, while we had been warning here on this site for months about the dangers of the vaccines and what stands behind them.
See above regarding the criminal complaint filed in France.
And see here the article we published last May, which warned of what was to come and which, to our sorrow, turned out to be remarkably accurate:
The death of the Chinese ambassador to Israel: is it possible that he was murdered because he claimed the virus was brought to China?
http://www.nakim.org/israel-forums/viewtopic.php?t=270564
I ask everyone who can, and who understands the significance of this post, to share it with all their might; this is about saving lives.
With thanks.
Haim Yativ
nakim.org
Attachment:
Description: Pathophysiological Basis and Rationale for Early Outpatient Treatment of SARS-CoV-2 (COVID-19) Infection
Filename: american journal of medicine covid early carePIIS0002934320306732.pdf
Filesize: 783.06 KB
Downloaded: 1178 time(s)
_________________ Join the Nakim Telegram channel to stay updated:
https://t.me/Nakim_org_Ch
The poll determines: most of the Israeli public believes that corruption harms their everyday lives.
Sign the petition against the corruption of the establishment and the justice system, and join "Nakim".
Haim Yativ, the spokesman of the "Nakim" organization, can be reached via Telegram @HaimYativ or by email at haim@nakim.org
Justice Mishael Cheshin: the war on corruption is a war of self-defense in which no prisoners are taken.
Last edited by Haim Yativ on Sun 31.01.21 22:05; edited 1 time in total
Guest
Posted: Sun 31.01.21 1:03    Subject: Oversight Board overturns Facebook decision: Case 2020-006-FB-FBR
Oversight Board overturns Facebook decision: Case 2020-006-FB-FBR
January 2021
The Oversight Board has overturned Facebook’s decision to remove a post which it claimed, “contributes to the risk of imminent… physical harm.” The Board found Facebook’s misinformation and imminent harm rule (part of its Violence and Incitement Community Standard) to be inappropriately vague and recommended, among other things, that the company create a new Community Standard on health misinformation.
About the case
In October 2020, a user posted a video and accompanying text in French in a public Facebook group related to COVID-19. The post alleged a scandal at the Agence Nationale de Sécurité du Médicament (the French agency responsible for regulating health products), which refused to authorize hydroxychloroquine combined with azithromycin for use against COVID-19, but authorized and promoted remdesivir. The user criticized the lack of a health strategy in France and stated that “[Didier] Raoult’s cure” is being used elsewhere to save lives. The user’s post also questioned what society had to lose by allowing doctors to prescribe in an emergency a “harmless drug” when the first symptoms of COVID-19 appear.
In its referral to the Board, Facebook cited this case as an example of the challenges of addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic.
Key findings
Facebook removed the content for violating its misinformation and imminent harm rule, which is part of its Violence and Incitement Community Standard, finding the post contributed to the risk of imminent physical harm during a global pandemic. Facebook explained that it removed the post as it contained claims that a cure for COVID-19 exists. The company concluded that this could lead people to ignore health guidance or attempt to self-medicate.
The Board observed that, in this post, the user was opposing a governmental policy and aimed to change that policy. The combination of medicines that the post claims constitutes a cure is not available without a prescription in France, and the content does not encourage people to buy or take drugs without a prescription. Considering these and other contextual factors, the Board noted that Facebook had not demonstrated the post would rise to the level of imminent harm, as required by its own rule in the Community Standards.
The Board also found that Facebook’s decision did not comply with international human rights standards on limiting freedom of expression. Given that Facebook has a range of tools to deal with misinformation, such as providing users with additional context, the company failed to demonstrate why it did not choose a less intrusive option than removing the content.
The Board also found Facebook’s misinformation and imminent harm rule, which this post is said to have violated, to be inappropriately vague and inconsistent with international human rights standards. A patchwork of policies found on different parts of Facebook’s website make it difficult for users to understand what content is prohibited. Changes to Facebook’s COVID-19 policies announced in the company’s Newsroom have not always been reflected in its Community Standards, while some of these changes even appear to contradict them.
The Oversight Board’s decision
The Oversight Board overturns Facebook’s decision to remove the content and requires that the post be restored.
In a policy advisory statement, the Board recommends that Facebook:
Create a new Community Standard on health misinformation, consolidating and clarifying the existing rules in one place. This should define key terms such as “misinformation.”
Adopt less intrusive means of enforcing its health misinformation policies where the content does not reach Facebook’s threshold of imminent physical harm.
Increase transparency around how it moderates health misinformation, including publishing a transparency report on how the Community Standards have been enforced during the COVID-19 pandemic. This recommendation draws upon the public comments the Board received.
https://oversightboard.com/news/325131635492891-oversight-board-overturns-facebook-decision-case-2020-006-fb-fbr/
Facebook (Guest)
Posted: Sun 31.01.21 9:39    Subject: Responding to the Oversight Board's First Decisions
Responding to the Oversight Board’s First Decisions
January 28, 2021
By Monika Bickert, Vice President, Content Policy
Today, the Oversight Board published their decisions on the first set of cases they chose to review. We will implement these binding decisions in accordance with the bylaws and have already restored the content in three of the cases as mandated by the Oversight Board. We restored the breast cancer awareness post last year, as it did not violate our policies and was removed in error.
Given that we are in the midst of a global pandemic, we feel it’s important to briefly comment on the decision in the COVID-19 case. The board rightfully raises concerns that we can be more transparent about our COVID-19 misinformation policies. We agree that these policies could be clearer and intend to publish updated COVID-19 misinformation policies soon. We do believe, however, that it is critical for everyone to have access to accurate information, and our current approach in removing misinformation is based on extensive consultation with leading scientists, including from the CDC and WHO. During a global pandemic this approach will not change.
Included with the board’s decisions are numerous policy advisory statements. According to the bylaws we will have up to 30 days to fully consider and respond to these recommendations. We believe that the board included some important suggestions that we will take to heart. Their recommendations will have a lasting impact on how we structure our policies.
We look forward to continuing to receive the board’s decisions in the years to come. For more information about what happens next in the process now that we have received these first decisions from the Oversight Board, please see the FAQ below.
Questions & Answers
Now that we have the Oversight Board’s decisions, what are the next steps for Facebook?
Since we just received the board’s decisions a short time ago, we will need time to understand the full impact of their decisions. We will update the Newsroom posts about each case within 30 days to explain how we have considered the policy recommendations, including whether we will put them through our policy development process.
Some of today’s recommendations include suggestions for major operational and product changes to our content moderation — for example allowing users to appeal content decisions made by AI to a human reviewer. We expect it to take longer than 30 days to fully analyze and scope these recommendations.
Are the Oversight Board’s decisions binding?
Yes. Today’s decisions (and future board decisions) are binding on Facebook, and we will restore or remove content based on their determination. The board’s policy recommendations are advisory, and we will look to them for guidance in modifying and developing our policies.
What will you do about content which is the same or similar to content which the board ruled on?
When it is feasible to do so, we will implement the board’s decision across content that is identical and made with similar sentiment and context. See below for more detail.
How many pieces of content are you taking action on as a result of the board’s decisions today?
We cannot provide a number right now since we are still reviewing the decisions. We’ve taken action on the individual pieces of content the board has decided on. Our teams are also reviewing the board’s decisions to determine where else they should apply to content that’s identical or similar.
Sharing More Details on How We Will Implement the Oversight Board’s Decisions
[Infographic: how Facebook will implement the Oversight Board's decisions]
How will a decision by the Oversight Board be implemented?
There are several phases to fully implement a decision by the Oversight Board. The board’s decision will impact content on the platform in two ways — through the binding aspects of the decision itself, and through any additional guidance or recommendation the board includes.
Case Content: As illustrated in the graphic above, we begin implementing the decision by taking action on the case content. Per the bylaws, we will do so within seven days of the board’s decision.
Identical Content with Parallel Context: Facebook will implement the Oversight Board’s decision across identical content with parallel context if it exists. First, Facebook will use the board’s decision to define parallel context. For example, if we see another post using the same image that the board decided should be removed, we may also remove that post if it is shared with the same sentiment. If the post has the same image but a different context (for example the post condemns the image rather than supports it) this would not be considered parallel context and we would leave it on the platform.
In order to take action on identical content with parallel context, Facebook's policy team will first analyze the board's decision to determine what constitutes identical content in each case. It will then determine the context in which the board's decision should also apply to the identical content. One key element of the analysis involves reassessing the case content's scope and severity based on the board's decision.
Next, Facebook’s operations team, who enforce our policies, will investigate how the decision can be scaled. There are some limitations for removing seemingly identical content including when it is similar, but not similar enough for our systems to always identify it. The operations team will ensure that new posts using the content are either allowed to remain on the platform, or are taken down, depending on the board’s decision.
After this step is completed, we will update the case specific Newsroom Post with our follow up actions on this content.
Similar Content: Similar content means content that is not immediately impacted by the Oversight Board's decision (the content is not identical or the context is not parallel), but that raises the same questions around Facebook policies. If the Oversight Board issues a policy advisory statement with its decision, or makes a decision in response to a Facebook request for a policy advisory opinion, Facebook will review the policy recommendations or advisory statements, and publicly respond within 30 days to explain how it will approach the recommendation. First, Facebook's policy team will review the recommendation from the board, and will decide if the recommendation should go to the Policy Forum for further review and to potentially change Facebook's Community Standards or Instagram's Community Guidelines.
After these steps are completed, Facebook will update the case specific Newsroom post and, in instances where the recommendation is considered at the Policy Forum, document the process in the Policy Forum minutes. Facebook has committed to considering the board’s recommendation through its policy development process. The policy development process is a large part of how we envision that the board’s decisions will have lasting influence over our policies, procedures and practices.
https://about.fb.com/news/2021/01/responding-to-the-oversight-boards-first-decisions/