Of course I didn't tweet during the Olympics. I work for a living! I watched what few highlights of curling they had on in the evening (because I couldn't get the full games like I wanted), and that was it. It's your own fucking fault, NBC. It's not like people haven't been telling you this. I also don't tweet every time I have a bowel movement or eat a scone, nor every time I watch some mundane television program.
Ah yes, a trillion dollars freaking wasted. Better burnt in a fireplace for warmth than spent on bogus litigation. Even if just 10% of that money had wound up spent on real innovation... Hell, even just 1%... My god!
You know what your DRM has done, Square Enix? It's prevented me from being a customer. I don't pirate things -- I buy them. But I don't buy shit from people who plan to treat me like a criminal or cripple my purchase with shitty DRM. Just like I refuse to buy things from EA, Sony, and Ubisoft. So bite me. More money for the artists and developers I care about, the ones who don't equate me with a machine that shits dollar bills into their hands on demand.
It really depends on a person's outlook on comments. It depends on whether you can find any obvious bias in the story, any sources for the data in the story, or any sources in the comments proving the story wrong or faulty. On many sites, comments are a YouTube-level failure. But on Reddit, which I browse from time to time, I always read the comments, because undoubtedly somebody has posted a very good rebuttal explaining why the article or study is flawed or sensationalist, or somebody in the field explains in better terms why it is important or what the applications are. And then there's the bias of whether you're reading about something in your own field. That's a lot of factors to control for, just right there!
I have to disagree. Setting aside the fact that most places won't look at your resume if you don't have a degree, I do feel there is a balance of school vs. real-world experience. I have seen bad programmers who were self-taught and bad programmers who came from colleges. The good programmers in college were mostly the ones who started down that path well before they got there, and who didn't just program for class -- they also did it in their spare time for hobbies and tinkering and such.
I have learned a lot in the past five years on the job, but what I learned in college (especially the six semesters of advanced math, plus the data structures, algorithms, software engineering, networking, and operating systems courses) has been critical to my success in the field. It's all very subjective, depending on what type of development you're doing and how much of your time is spent writing code versus doing R&D, algorithm development, fine-tuning, or anything of that nature.
Being a good developer is about more than just being able to sling code. (Almost) anybody can cut and hack a pile of shit together with some proverbial duct tape and baling wire. But there is also no substitute for real, on-the-job training. It takes a very committed self-taught person to follow good programming habits and practices while tinkering alone, and solo tinkering doesn't build the skills that larger group collaboration does.
[snip] how in light of advancements in communications technologies, the United States can employ its technical collection capabilities in a manner that optimally protects our national security and advances our foreign policy while respecting our commitment to privacy and civil liberties, recognizing our need to maintain the public trust, and reducing the risk of unauthorized disclosure [snip]