The proceeds from Naruto’s photo should be used to help endangered monkeys.
That statement alone shows a huge amount of bias. Just for a second, let's assume that "Naruto" has all the rights and duties of a natural person. So, yes, the copyright would fall to the monkey. Not PETA, the monkey.
Now, first off, what proceeds are you talking about? If the copyright were assigned to the monkey, what would happen is simply that nobody could use the photo in question. Or do you really, truly, actually think that Naruto would then go on to personally license the photo to a rights management organization?
Secondly, why the flying frak are you assuming that he would want his earnings going to help endangered monkeys? Maybe he just wants to buy some shit.
Back to reality.
On the one hand, you're trying to push this narrative that the monkey is sentient and capable of independent action. At the same time, you're shoving your own biases and wants down its throat, treating it as though it cannot make rational decisions.
I'll concede the monkey has the rights to the photo just as soon as it directly claims those rights. PETA is not trying to get the rights assigned to the monkey; they're trying to get the rights assigned to PETA, "on behalf of" the monkey. If that arrangement is legitimate, they should have a contract with the monkey's signature, since we're all trying real hard to pretend that the monkey has magically become sentient.
Has PETA even been in contact with the monkey? Are there communication records? Has it expressed interest in this case to the press?
You have no knowledge of recent history, do you? You sit here enjoying the fruits of the Computer Revolution, yet are gleefully trying to clamp down on the rights and abilities that brought those fruits to bear.
Ok, I guess I need to give a quick history lesson.
The Dark Ages
In the '60s and '70s, personal computing was a laughable pipe dream, despite the commercialization of the silicon transistor in the '50s. Computers were proprietary mainframe/terminal setups, and cost exorbitant sums of money. This was because each seller had to build their system from the ground up, hardware and software. As the '80s approached, hardware costs started to come down, but software costs kept going up: companies still had to write the complete codebase for their proprietary systems. Compatibility was unheard of, and prices were still too high for personal computing to reach the general public. Hobbyists could put together relatively cheap kit computers, but retail desktops cost $5000-$10000, adjusted for inflation.
Enter "Open Architecture"
The early commercially successful computers, the Apple II and the IBM 5150, both utilized a published, card-based, open hardware architecture. This meant that any company could follow the spec and produce hardware components compatible with the machines, and allowed third-party software to enter the mainstream. Now, instead of using whatever proprietary word processor came on the machine, you could run WordStar, or any other software. This meant that the computer manufacturer didn't need to develop all that software in-house, lowering the cost of the machine.
The Clone Wars
It's the early '80s, and home computing is starting to take off. IBM dominates the market, but they're still too expensive for most households. Still, they've built up an ecosystem of third-party software that consumers demand. "Does it run Lotus 1-2-3?" is the death knell of many a new entrant. Things look bleak for everyone but Big Blue.
Then inspiration strikes. IBM's machines run PC-DOS, provided by a small company called Microsoft. Microsoft also sells the OS, as MS-DOS, to any interested third party. Some companies try to break into the market using MS-DOS, but differences in the BIOS mean that programs need tweaking before they can run on each machine.
Compaq wants to build a fully compatible IBM clone, but they can't just copy IBM's BIOS due to copyright law (see Apple v. Franklin). They can, however, independently create their own BIOS that behaves identically. They perform a clean-room reverse engineering of the IBM BIOS and build the first true PC clone. When IBM's lawyers can do nothing to stop Compaq, the floodgates open. The new competition enabled by these "knockoffs" drastically lowered the price of computing hardware, bringing about the commoditization that we enjoy today.
That's only a brief overview; there's much more to the story, and I encourage you to read up on it. It should make clear the pivotal role that reverse engineering and third-party compatibles played in bringing about ubiquitous computing, though.
I get where you're coming from, but your suggestions are at odds with your ideals. VAT structures are a mess to actually implement. Suddenly the government needs to know the value of every good and service, and entities selling goods or services need to navigate a quagmire of tax classifications just to figure out how much they owe. There's no way you're fitting that in 100 pages. If anything, it would end up more complicated than the system we already have.
Not to mention that government agencies (State and Local mostly, since there's no Federal sales tax) already have trouble keeping track of sales taxes. Did you know that in most States, if you buy something online from an out-of-State retailer (i.e., interstate commerce), you're still supposed to pay a use tax to your own State? Any tax the retailer collects goes to the other State; you're supposed to self-report the sale to your own. How many people actually do this? And let's not even get into cash transactions...
No, if you want a simple tax structure, you need to base it on money coming in, not going out. Then, you need to provide "deductions" for things you want to tax less, such as health care. That's when the tax code becomes complicated. There's no easy answer, I'm afraid.
The futurist in me would like to see the Department of Commerce implement a universal electronic funds system, available to every individual or business, theoretically obviating the need for cash, allowing automatic calculation of taxes, and cutting into financial crimes such as fraud and money laundering. In the real world, though, I'd be terrified of that system, for reasons aptly described by this article.
If this attitude had been prevalent in the '80s, this technology which you seem so keen to protect would simply not exist. We'd still be dealing with proprietary black box systems, and innovation would have been slowed to a crawl.
Reverse engineering can be used to develop a part that is indistinguishable in operation from a "licensed" part. This right needs to be protected and championed.
The parties involved could agree to offer consumers the choice of having ESPN or not.
Comcast is bleeding from a thousand cord cuts. They need to retain customers. They can do this by lowering the price point of their service (basic economics). A good way to lower price is to more efficiently meet customer needs, i.e., lowering costs. Comcast could save money by splitting off ESPN to an optional package. However, ESPN currently has a contract which forbids this...
ESPN is also feeling the crush of innovation. The good news is that demand for Sports hasn't gone down (at least significantly). As was touched on in the article, they were riding high on the cable bubble, but reality has come rushing in, and it's time for them to tighten their belts. They can't offer a streaming service because it would invalidate their existing contract, turning a relatively stable glide to the ground into a nose dive. They need that lifeline in order to buy time to adapt. However, if they keep strangling Comcast with it, neither company will survive. So, ESPN could strike a deal with Comcast, allowing unbundling of the channel in exchange for either an upfront capital infusion to allow them to move forward, or for striking the provision regarding streaming services from their contract.
Applauding T-Mobile for this move is like thanking a guy for only punching you in the face once. "Hey, I was going to punch you in the face twice, but I changed my mind. That guy over there? He'd punch you in the face three times. You made the right choice."
Clearly I have the copyright on punching based analogies as applied to Binge-On.
You'll be hearing from my lawyers.
(It's a good analogy, not surprising that parallel construction happens.)
Except if they were concerned about peak bandwidth usage exceeding allotted capacity, data caps are a poor way to manage that. Data caps attempt to manage congestion through secondary effects, and they do so very inefficiently. The best way would be to simply implement basic network QoS and throttle down users when a node is about to exceed allocated capacity, preventing ballooning costs from rare usage spikes. You can even do this "fairly" by first throttling users with a heavier usage profile; those who contributed most to the average network burden would be those who feel the effects.
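To make the idea concrete, here's a minimal sketch of that "throttle the heaviest users first" policy. This is just an illustration of the progressive-fill (max-min fair) allocation I'm describing, not any ISP's actual implementation; the numbers and names are made up.

```python
def throttle_plan(demands, capacity):
    """Allocate a node's capacity (e.g., Mbps) across users.

    demands: dict mapping user -> requested bandwidth.
    If total demand fits within capacity, nobody is throttled.
    Otherwise, fill from lightest user to heaviest: light users get
    their full demand, and the heaviest users absorb the shortfall
    (max-min fairness).
    """
    if sum(demands.values()) <= capacity:
        return dict(demands)  # no congestion; serve everyone in full

    alloc = {}
    remaining = capacity
    # Walk users from lightest demand to heaviest.
    users = sorted(demands, key=demands.get)
    for i, user in enumerate(users):
        # Equal share of what's left among users not yet allocated.
        fair_share = remaining / (len(users) - i)
        alloc[user] = min(demands[user], fair_share)
        remaining -= alloc[user]
    return alloc


# Example: a 12 Mbps node with one light, one moderate, one heavy user.
plan = throttle_plan({"light": 1, "medium": 5, "heavy": 20}, capacity=12)
# The light and medium users keep their full demand; only the heavy
# user is throttled, down to the 6 Mbps that remains.
print(plan)
```

The point of the sketch: only the user who caused the congestion feels it, and the throttling lasts exactly as long as the congestion does — unlike a monthly cap, which punishes usage that may have happened at 3 AM on an idle network.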
I'm starting to think these companies know that they're on a sinking ship. If you accept that cord cutting is inevitable, their behavior does make a perverse kind of sense.
1. Over the next decade or so, all media consumption will be done over IP streaming. No one will continue to pay for cable.
2. Fiber to the home will become the standard method of connectivity for urban/suburban population centers, making the cable infrastructure meaningless.
1. Assumption 1 indicates that a primary revenue source will dry up, regardless of their actions.
2. Assumptions 1 and 2 together mean that a large amount of capital infrastructure is about to become worthless.
3. Because of point 2, if they wish to continue to compete (long-term) in the ISP market, they'll need to procure a huge amount of capital expenditures.
4. Because of point 1, they will not be able to rely on their other service to fund the necessary infrastructure changes.
Conclusion: In order to remain competitive long term, these companies need to spend a large amount on infrastructure. Even if they do so, however, there is no guarantee that a suitable ROI will follow. Competitors such as Google Fiber make the market uncertain.
However, their previous capital expenditures have not yet been completely devalued. By gradually raising prices and cutting costs, they can drain every last cent from the business before it inevitably goes under. By taking the funds wrung from the dying business and investing them in a different market, they can get a better return than if they had tried to pivot their business.
This is why Omnibus bills are frakking terrifying. Anything that can get shoved into that monstrosity will effectively become law without any meaningful debate, and almost zero chance of being voted down.
Congress desperately needs to make some procedural changes to the legislative process. One of the most badly needed changes is to do separate votes for each attachment to a bill. That way, even if someone attaches a rider to a "must-pass" bill, the bill itself can still pass even while the rider is voted into oblivion.
Seriously. By my rough calculations, this whole ordeal cost them in the ballpark of $2500. (And that's assuming they have proper design and marketing tooling; I sincerely hope they didn't have to modify each of those images by hand.)
They don't seem to be a crazy tiny company, but that's still a non-trivial cost they had to bear, because... reasons.
"That said, high speed Internet access is optional."
Even that's highly debatable, depending on your employment or other communication needs. I certainly couldn't do my job, or keep up personal communication, as effectively from a remote location if my apartment and those of friends/family didn't have broadband.
Mm, true. There are plenty of edge cases (especially for those who work in the tech sector) where people need a reasonably fast connection in order to, y'know, continue to feed themselves.
~90% of people (obviously a number pulled from the aether) don't need more than basic web browsing/email, though.