Link
[…] For advocates of the legislation, the directive will balance the playing field between US tech giants and European content creators, giving copyright holders more power over how big internet platforms distribute their content. But critics say the law is vague and poorly thought-out, and will restrict how content is shared online, stifling free speech in the process.
[…] Two parts of the copyright directive are particularly worrying to critics: Article 11, known as the ‘link tax,’ and Article 13, dubbed the ‘upload filter.’ Article 11 lets publishers charge platforms like Google News when they display snippets of news stories, while Article 13 (renamed Article 17 in the most recent draft of the legislation) gives sites like YouTube new duties to stop users from uploading copyrighted content.
In both cases, critics say that these well-intentioned laws will lead to trouble. Article 13, they say, will lead to the widespread introduction of “upload filters” that will scan all user content uploaded to sites and remove copyrighted material. The law does not explicitly call for such filters, but critics say they will be an inevitability as sites seek to avoid penalties. […]
Hey, EU pals. I know things look shitty right now, but no matter how Article 11 and Article 13 turn out, remember one thing:
GO. VOTE.
Get these people out of their seats. They don’t know what they are doing; they called protesters actors, said petitions were signed by bots, and called everyone lazy and disinterested in politics - all while their own seats in parliament were safe.
They don’t realize they are hurting not just creative outlets, but also free speech, social media and eCommerce. Livelihoods depend on the internet. Europe’s economy and cultural landscape will suffer, and so will other parts of the world.
Remember this. Be angry. And no matter what happens after this now: Stay angry. And go vote.
Video
Smart technologies let people control their devices more easily, but they also make people (the data owners) easier for data thieves to control. We have done a lot to develop new and advanced smart tech to make life easier, but it seems we now need to do just as much to deal with the consequences those new technologies bring. People sacrifice long-term security for instant convenience.
Internet of Things gadgets are vulnerable to the same takeovers as regular computers, but the consequences can be much bigger.
Link
Information confidentiality is the primary concern in cloud computing. For example, when user resources are shared across enterprises, leakage of information is hard to avoid. If the technology does not keep data sufficiently confidential, the owners of that information may be seriously affected.
How cloud computing can benefit small and medium-sized businesses. Read 7 financial benefits of moving to the cloud.
Text
Algorithm-based notice and takedown mechanism
The notice and takedown mechanism built on manual notification and manual review is a product of the Web 1.0 era. With the development of Web 2.0 and artificial intelligence, algorithms can now send out notifications automatically and automatically filter allegedly infringing works.
Both manual and machine censorship have disadvantages:
Human censors are less efficient and more subjective than machines, while automated takedown requests may be of questionable validity: the targeted content may not actually match the identified infringed work.
In addition, some hold the view that automated censorship lacks due process and risks over-censoring.
So, automated takedown and human censor, which one is better?
In practice, reports such as Facebook’s plan to hire 3,000 workers to review videos show that OSPs not only keep improving their algorithms for reviewing online content but also keep recruiting human reviewers.
Zuckerberg, CEO of Facebook, said that artificial intelligence techniques would take “a period of years ... to really reach the quality level that we want”;
and Sarah Roberts, a professor of information studies at UCLA who studies content moderation, said,
I don’t know of any computational mechanism that can adequately, accurately, 100 percent do this work in lieu of humans. We’re just not there yet technologically.
A common arrangement is to perform the initial screening with an algorithm and then hand borderline cases to a dedicated human review team. The algorithm can process massive amounts of data quickly, narrowing the scope of review and reducing the workload.
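The two-stage arrangement described above can be sketched in code. This is a minimal illustration only, not any platform’s actual system: the similarity function, the two thresholds, and the label names are all assumptions made for the example (real systems use audio/video fingerprinting, not string comparison).

```python
# Hypothetical two-stage notice-and-takedown screening:
# an algorithm scores each upload against a catalog of claimed works,
# auto-removes near-certain matches, forwards ambiguous cases to human
# reviewers, and allows everything else through.

from difflib import SequenceMatcher

REMOVE_THRESHOLD = 0.90   # near-certain match: remove automatically
REVIEW_THRESHOLD = 0.60   # ambiguous match: queue for a human reviewer

def similarity(upload: str, claimed: str) -> float:
    """Stand-in for a real audio/video fingerprint comparison."""
    return SequenceMatcher(None, upload, claimed).ratio()

def screen(upload: str, claimed_works: list[str]) -> str:
    """Return 'remove', 'human_review', or 'allow' for one upload."""
    best = max((similarity(upload, w) for w in claimed_works), default=0.0)
    if best >= REMOVE_THRESHOLD:
        return "remove"          # high-confidence match: automatic takedown
    if best >= REVIEW_THRESHOLD:
        return "human_review"    # narrow the human workload to edge cases
    return "allow"               # no plausible match

catalog = ["never gonna give you up full track"]
print(screen("never gonna give you up full track", catalog))  # remove
print(screen("never gonna give you up remix clip", catalog))  # human_review
print(screen("my original vlog audio", catalog))              # allow
```

The design choice the thresholds encode is exactly the trade-off discussed above: raising REVIEW_THRESHOLD reduces the human workload but risks over-removal; lowering it shifts more borderline (and potentially fair-use) material to human judgment.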
In conclusion, both technology and professional reviewers are needed.
And to build an effective and legitimate NTD mechanism with algorithms, we should consider two aspects:
1. Algorithm Design
In the era of artificial intelligence, algorithms are widely used and the number of notifications grows exponentially. If there are problems in the design of the algorithm, erroneous notices will be generated at the same massive scale, and the consequences can be very serious.
Besides, when designing a censoring algorithm, it is necessary to set a reasonable matching threshold to prevent both excessive and insufficient censoring, and to consider building in fair-use exceptions.
2. Transparency: Disclosure Obligation
Unlike law enforcement by courts or governments, the operation of these algorithms cannot be known to the public, because they are usually trade secrets.
Moreover, NTD is a form of private regulation based on legal authorization: it effectively delegates law-enforcement functions to OSPs, which are usually private institutions that aim to maximize profit. Power may therefore be abused, which would harm fair competition and market innovation and undermine basic rights such as freedom of expression.
Therefore, to effectively regulate the abuse problem, the first step is to promote algorithm transparency. But proper transparency does not mean disclosing the whole algorithmic program; rather, it means disclosing the notification information OSPs receive.
In the future, when the law is amended, a legal obligation for OSPs to publicly disclose the notifications they receive from right holders should be added.
Text
Remixed music on TikTok
According to Wikipedia, TikTok is a social media video app for creating and sharing short lip-sync, comedy, and talent videos. Its Chinese developer, ByteDance, previously launched Douyin (Chinese: 抖音) for the Chinese market in September 2016. The application allows users to create short music and lip-sync videos of 3 to 15 seconds and short looping videos of 3 to 60 seconds.
The short videos that we see every day are generally divided into two categories. One is self-created, including short albums, short scenes, skill-sharing, and so on. The other is to edit and remix others’ videos and make a new one, including re-edited movies, reshooting and more.
Almost every video contains music, such as background music or cover versions of existing songs. As Chris Eggertsen said:
“While TikTok bills itself as a video-sharing app rather than a music service, music has become a key component of its brand -- particularly as the platform has lately driven the success of a number of tracks.”
And people have different opinions about remix music in videos.
One opponent put it like this:
“TikTok is built primarily around lipsynching to copyrighted audio, including content created by independent creators…. that’s privacy violation and copyright infringement, and that’s the real reason why you shouldn’t use TikTok.”
However, TikTok has little incentive to take on additional burdens. The Digital Millennium Copyright Act of 1998 (DMCA) puts the onus on users rather than tech companies to report copyright violations.
And according to TikTok’s intellectual property policy, “We do not allow any content that infringes copyright.”
Specifically, people who find their copyright-protected work posted on TikTok need to submit a copyright infringement notification; any user content that infringes another person’s copyright may then be removed, and the uploader’s account may be suspended or terminated.
For an online content-sharing platform like TikTok, as opposed to other commercial music markets, I think the “notice-and-takedown” rule is sufficient copyright protection for remixed content, for the following reasons:
1. Many music works (such as “Steppin” and “Ride It”) become more famous after being adapted or remade. Some original authors do not intend to defend their rights, and some are even happy that someone else is remixing their songs. When the right holder does not intend to hold anyone accountable, does that mean the problem is not serious enough to justify increasing the obligations of the platform or its users?
2. People usually use this platform for entertainment; excessive intellectual property protection can hinder creation by ordinary people.
In conclusion, although copyright infringement occurs more frequently when the liability of the platform or its users is too light, or the cost of infringement too low, it is also this leniency that encourages creation.
As Mary Rahmani, TikTok’s director of music content and artist relations, said in an email:
“TikTok empowers artists by being an avenue for visual output and creativity,” “We offer a platform that is creative, collaborative, global and unique.”
Text
The principle of proportionality can be used for defining the liability of ISPs.
Article 8(3) of the InfoSoc Directive ensures that right holders can have ISPs block websites involved in copyright infringement. This provision rests mainly on the fact that the intermediary service provider is in a good position to stop infringement because of its control over access to the website.
But blocking injunctions, while effective against piracy, have also caused widespread controversy.
The public worries that such measures will affect their access to information and hinder the development of the internet industry.
ISPs believe they should not have to take on such supervision and management obligations, for the following reasons:
1. Blocking injunctions disregard due process, since intermediaries are not the direct infringers.
2. ISPs also worry that monitoring users will cost them users’ trust.
3. Blocking has only a limited effect on IP protection, because pirate websites can circumvent bans by technical means. ISPs must instead keep taking steps to block illegal websites, which disrupts their normal business activities. Moreover, blocking costs time and money, further increasing ISPs’ operating costs and undermining their business interests.
The question here is how much responsibility ISPs should bear for their website operations and their users, how to reconcile blocking injunctions with the principle of “technology neutrality,” and how to balance the interests of ISPs, users, and right holders.
According to the ECJ’s 2014 ruling in the Telekabel case, ISPs’ blocking measures need to meet certain criteria, which one blogger described as follows:
“in the sense that they must serve to bring an end to the infringement of copyright or related rights without affecting the accessibility of lawful information.”
I think the principle of proportionality in Chinese administrative law may help to further refine these two standards. The principle of proportionality has three parts:
1. Appropriateness. If imposing an obligation does not help to contain copyright infringement, it fails the appropriateness requirement. Appropriateness should be judged on factors such as how hard the block is to circumvent and how much it reduces the overall network piracy rate.
2. Necessity. If a less restrictive means would let ISPs achieve the same effect as website blocking, that less restrictive means should be adopted. Necessity here means minimum damage.
3. A value-balancing mechanism. Copyright protection is not the only purpose; other factors should also be weighed, such as the need to “avoid the creation of barriers to legitimate trade and to provide for safeguards against their abuse” (Art. 3, Directive 2004/48/EC) and the “public interest” in Section 115A of the Copyright Amendment (Online Infringement) Act 2015.
In general, it is important to impose certain obligations on ISPs through the principle of proportionality. Finding the best balance between copyright protection and the rights of owners and users is still an unfinished task that needs to be explored in practice.
Text
When is software patentable? The Supreme Court is about to weigh in
Byron M. G. Sanford, Esq.:
The Alice Corp. case definitely represents a potential pivot point for software patents. The determination of what is an abstract idea and what is not, in the context of computer software, has long been a difficult and fuzzy process. It is hoped that the U.S. Supreme Court will use its decision in the Alice Corp. case to clarify that analysis, thus providing clearer…