Umberto Eco, the Art of Fiction No. 197 (2008) - samclemens http://www.theparisreview.org/interviews/5856/the-art-of-fiction-no-197-umberto-eco ====== matthewmcg My favorite part (Foucault's Pendulum is a fun book too): INTERVIEWER: Have you read The Da Vinci Code? ECO: Yes, I am guilty of that too. INTERVIEWER: That novel seems like a bizarre little offshoot of Foucault’s Pendulum. ECO: The author, Dan Brown, is a character from Foucault’s Pendulum! I invented him....I suspect Dan Brown might not even exist.
{ "pile_set_name": "HackerNews" }
Mountain Lion's new file system - speednoise http://informationarchitects.net/blog/mountain-lions-new-file-system/ ====== eblume I couldn't get past this part: Even geeks can’t handle folders in folders Hierarchical file systems weren't invented because it was ancient times and they were the easiest thing to produce. Hierarchical file systems were invented because they are a really, really good paradigm for storing and retrieving hierarchical data - which I strongly believe is still the case for the majority of files. I didn't read further into the article so perhaps I missed where this same idea was discussed, but I think a useful idea going forward would be meta-tagging of data. Something very basic like the ability to flag a file and later search or sort by those flags. Do it for specific folders, like the Documents folders. Done. Now the 'non-geek's can deal with flat file structures and us 'geek's can do smarter things. ~~~ stcredzero _> Hierarchical file systems weren't invented because it was ancient times and they were the easiest thing to produce._ Correct. They were invented because it was just past the middle of the 20th century, and they were the best organization mechanism for the resources available. They're also very robust, versatile, and wonderful in innumerable ways, especially for the audience at the time. Unfortunately, they're not the best thing for most users today, when we have more computing power and experience with search as an organization tool. ~~~ johncoltrane Search is the opposite of an organization tool. It's a tool that allows users to avoid being organized. ~~~ stcredzero Or, it's a tool that allows users to have organization, without needing to implement it themselves. (After all, the indexes of the search mechanism clearly have a high degree of organization.) ------ crazygringo > _Folders are not a feature that beginners muddle through but pro users > require. No one can deal with deep folder structures. Our brain is simply > not built for them._ This is ridiculous. I have computer files I've produced over the past 20 years. As time goes by, I continually archive them in a big old hierarchical folder called "Archive". Broken down by life phase, school year, project, etc. Without folders-inside-of-folders that whole system would be a _mess_. Or, programming projects. How can you even imagine organizing your stylesheets, plugins, libraries, components, endpoints, controllers, etc. without folders? I'm sorry, but throwing away hierarchical folders completely is a monumentally stupid thing to advocate. Beginners might be better off without them, but pro users _absolutely_ require them. And our brain is most _certainly_ built for them: <http://en.wikipedia.org/wiki/Memory_palace> Now, I'd love to have tags as well, but "keep your hands off my folders!" :) ------ revelation Coming from Linux, you would think "new file system" would mean that: a new file system. But it's just the same smoke and mirrors Microsoft did when they scrapped WinFS: add magic folders, terrible ideas (translated folder names?) and missing hard features (links) in a limited fashion. That's sad, because the file system could certainly do with a complete makeover: metadata, built-in sync with the cloud, backup and encryption as first-class citizens, etc. etc. For now, that's only available in enterprise solutions like ZFS or in the still very alpha btrfs. ~~~ Lewisham My feeling is that Microsoft is much closer to a post-files world than Apple is.
Apple thinks files should stay within the app, which is exactly the opposite of why we had consumer-viewable file systems in the first place. We want/need to be able to edit files in different programs. Windows 8's Contracts, combined with system-wide Share, and storing everything on Skydrive (so when you log in to another Windows 8 machine with your Live ID, it's all there) is a much more rounded vision for a post-file world, one where you have documents which are shared around, rather than opening apps then pointing at files. ~~~ shock3naw And once uploaded, Microsoft will inspect your files and ban/suspend you from SkyDrive, Hotmail, Xbox Live, and the Marketplace if they don't like something. [[http://wmpoweruser.com/watch-what-you-store-on-skydriveyou-m...](http://wmpoweruser.com/watch-what-you-store-on-skydriveyou-may-lose-your-microsoft-life/)] ------ drcube This is bullshit, in a lot of ways. 1) This isn't a file system. It's a file structure. 2) Presumably I can still make directories inside of other directories if I want to, right? Given that this isn't an actual "new file system", they can't stop me, right? This was totally unanswered by the article. 3) If they actually prevent hierarchical file structure, can OSX still be called Unix? I think you need root, /usr, /dev, /etc, et cetera. 4) In all my years of dealing with noobs, I've never seen anyone flummoxed by a hierarchical file setup. Yeah, when they first start out, many people open files by clicking the program and choosing "open". But it usually isn't long before they want to start transferring files between devices, folders and programs. And if you need to do anything like that, you need a file browser and hierarchical structure. I guess I've read it enough to believe it, but I can't. How can computer professionals actually suggest it should be like this? For _everyone_ and not just some training wheels I can take off? ------ rflrob I find the claim that _nobody_ understands multi-level file systems hard to swallow. Inside my Documents folder, I have a folder for one of each of several organizations I work with, inside of those a set of folders for one of several different things I do for those organizations, and within some of those, a folder for every project I have going on. I don't necessarily need to know the complete path to everything, but anyone can generally look at the directory listing and see what the next directory to look into is. ------ trotsky Is it a symptom of the fact that he believes himself to be an expert on the subject that he enjoys using the term "file system" over and over in a way that goes against 50 years or more of common usage? Or was it just link bait for those of us who thought there might be some new HFS+ implementation? ~~~ novalis I started reading the article expecting to find some sort of description of a new file system, and a couple of paragraphs in I discovered that it reads like the how-to-frame-yourself-as-an-opinionated-clueless-twit manual of the year, so I am going with link bait. Priceless link bait. ------ iandanforth Information architecture from a neuroscience point of view. Concept building: A bottom up hierarchy Everything you know was learned in the context of prior knowledge. You combined prior experiences, refined them, and over time solidified them into new constructs which you then used to repeat the process for higher level concepts.
Visual light and dark blobs become coherent shapes which get associated with meaning and eventually those meanings get associated with names like 'chair' or 'rabbit.' Content access: A top down (and sideways) lookup. In many cases we know what we want without bothering to think of the name. Being thirsty may mean you think 'glass','faucet','stream','bubbles', and 'water.' But you don't have to think any of those words to get a glass of water and take a drink. Unfortunately in much of IT there is a textual interface so you first have to pick a starting word and then use that to find what you're looking for. Any time you think of a word it is physically connected to many other notions, memories, experiences, and other words built over a lifetime of experience. When we try to describe what something _is_ we access the hierarchy we have learned. A chair is a piece of furniture, which is a solid object, which is something I can touch. We may even use this description to create a hierarchical description. The tension here is that how we describe things is at best a very limited subset of how our brain connects to information about that thing. These mappings differ from person to person and change over time. One system of hierarchical categorization cannot be intuitive to all people and probably won't be to the same person after 10 years. IMNSHO The only interface which will 'just work' for organizing labeled objects is one that _knows you_. The best example of this is Google's one box which searches my computer, builds associations between content that are not directly related to the search terms I use, and modifies itself over time based on my behavior. ~~~ awongh this. OP should keep in mind that any user interface is built upon _learned_ , _flexible_ metaphors for the data it represents... files and folders are a metaphor that break down for some use cases and work well for others... I always have a hard time taking anyone seriously who thinks that these metaphors can be fundamentally improved. A metaphor that's intuitive for one person could be completely confusing and out of context for someone else. ------ kilemensi This argument does not make sense at all. Almost everything we do or own in the real world is based on one form of hierarchy or another. Work (CEO > Executive > etc.), Home (Parents > Old Siblings > You > etc), House (House itself > rooms > closets in a room, etc.) These are not just labels and there is nothing geeky about them either. They imply a certain order or sequence of things that can not just be moved around. If there is one thing that we as people are good at, it must be hierarchies. Tags on the other hand do not imply order of things. They are more about how you'd describe or how you relate to these things. For example you can have a 'favorite' tag and apply it both to your sibling as in 'your favorite sibling' and one of your closet as in 'your favorite closet in your room'. It says nothing about the order of these things just that your like them. Also, tags can be somewhat temporal as opposed to hierarchies. For example if we take a folders and files example, I can have a top level folder called projects which contains sub-folder for each project I've ever worked on. In each of these sub-folders I can then store the project-specific files. I can use 'current' tag to tag the project I'm currently working on. 
When I finish this project and get another project, I then remove the 'current' tag from the just finished project and move it to the new project leaving the hierarchy intact. The point is not to use folders when you need tags or use tags when folders are required. The best files system would be the one that allows you to use both as situation demands. ------ DannoHung When is someone in charge of such things for an OS vendor going to realize that for any of "my" documents, what I really want is to tag them? ~~~ jgeorge I would jettison every piece of computing hardware I own right now and switch to the first device that would let me add (and search) tags in file metadata. ~~~ justincormack I think there are some Linux solutions for this using extended attributes (which are by the way pretty easy to use). ~~~ simcop2387 And one that doesn't, and is AFAIK incompatible with others. Nepomuk from KDE does this with a separate database, I believe this is mostly to allow it to work when xattr isn't available (say for some network filesystems). I've not actually used it much myself, (there's far too many files for me to even begin doing it.) ~~~ justincormack Most Linux distros do not enable xattr support at all by default, which is where the database thing came from. Very annoying. ------ rsync So ... what happens if I drop to the terminal and: mkdir -p /Users/username/folder1/folder2/folder3 ... and then open the finder and navigate to my home dir ... do I just not see past the first nesting ? Is it hidden ? ------ amitparikh Windows did it first... My Documents / My Downloads / My Movies / My Music / My Pictures. The author makes it seem like Mac's "default content folder" scheme was a novel idea. ~~~ Lewisham [citation needed] I'm fairly sure that Mac OS X was here first, and it was copied to XP. IIRC, I was running Windows ME at the time of Mac OS X, and ME did not have those default folders. ~~~ akshaykarthik <http://en.wikipedia.org/wiki/My_Documents> Windows has had a "My Documents" folder since win98. ------ liquidzoot I really don't understand what is being said here. How are folders a hard concept? ------ russelluresti I'll have to see what this is like by actually using it. Right now, my biggest concern is that it usually makes for better usability if you allow the user to organize and group content in a way that makes sense to them. I don't feel this is an area where the OS should take control away from the user. ------ iioowwee Wasn't the original Apple "filesystem", circa II, a flat one? That is, a list. Not only is it easier to work with, as the article suggests, but, obviously, it's faster! I use globbing on a daily basis over other later approaches, e.g. regex. But it works best with a relatively flat filesystem. Too deep, and it's off. The simplest approach possible. But before we can fix filesystems, maybe we need to teach people how to name files in ways to make their life simpler. Long filenames and ones with spaces and punctuation inevitably become a huge PITA. Yet we think of these as "must- have" conveniences. I used to believe that too. But over time I've realized this slows things down immensely and introduces lots of unneeded complexity. Speed and simplicity is more important. If you can get by with only a "list of files", you are better off. ------ LaSombra I think funny that it's easier to dumbi-fy the user instead of creating something worth some like tagging and making metada easier and more useable. EDIT: Also, I think he's referring to iCloud only. 
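To make the extended-attribute tagging justincormack brings up earlier in this thread concrete, here is a minimal sketch using the standard Linux attr tools. It assumes a filesystem mounted with user xattr support and the setfattr/getfattr utilities installed; the tag name and paths are placeholders, not anything prescribed by the thread.

    # store a user-defined tag on a file (the user.* namespace is writable by ordinary users)
    setfattr -n user.tag -v "current" ~/projects/acme/report.odt

    # read the tag back
    getfattr -n user.tag ~/projects/acme/report.odt

    # list every file under ~/projects carrying that tag
    # (getfattr prints nothing and exits non-zero when the attribute is missing)
    find ~/projects -type f -exec sh -c \
      'getfattr --only-values -n user.tag "$1" 2>/dev/null | grep -qx current && echo "$1"' _ {} \;

As justincormack notes above, xattr support is not always enabled (and some network filesystems lack it entirely), which is one reason desktop search tools have fallen back to separate tag databases.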
------ jmulder Folders or tags are hard in the sense that both of them require a user to think about organisation, when in reality any of them just wants to find the file 'magically' where they last left it and do stuff with it. In most cases this is in some application or (in my opinion still a temporary stop gap) a one level deep file system. The applications itself will serve the needed context or metadata (type of document, last modified time, access/sharing privileges) to find the files you're looking for. ------ callil For some reason I think the author is being overly wordy. The point if I understand it, is that Apple is once again moving OSX towards iOS by switching from a strict folder based filesystem to an App structured file system (from the user's point of view). This allows people to find content and files more easily because they will always be where they left them. In the app that uses them. ~~~ LaSombra I almost would not mind this if I could open files in new programs like the Share feature of Android. This is one of the reasons I dislike iOS and barely use my iPad for anything but comics and web reading. ------ crazygringo > _We are the people that put salt and pepper on the pizza before trying it, > because we just know best._ Salt and pepper... on _pizza_? Is that a thing? ~~~ drcube I put pepper on pizza. If you have meat or cheese on your pizza, adding salt is just too much (for me). ------ quote I guess we'll have to see how this holds up: Usually the problems arise not from one's own folder structure (most people know their way around what they've created) but when working together in someone else's structure. Does this approach solve that problem? ------ stcredzero _> But that’s not the real reason why geeks are skeptical. It’s because we are smart asses. We are the people that put salt and pepper on the pizza before trying it, because we just know best._ I couldn't have put it better. It goes double for us coders. ------ mbq I'm sure this will add a lot of awesomeness to tasks like creating a web page or an application; all the pngs and jpegs mixed together in Photos, templates and styles in Documents... ------ 1234567bob Nothing to do with file systems, got bout half way before I realised this was just some tool talking about how hard it is to keep his shit organised.
{ "pile_set_name": "HackerNews" }
Microsoft, RedHat, IBM, Docker, Mesosphere, CoreOS and SaltStack join Kubernetes - bgoldy http://googlecloudplatform.blogspot.com/2014/07/welcome-microsoft-redhat-ibm-docker-and-more-to-the-kubernetes-community.html ====== ifup The cool thing is that we have a number of companies contributing significant technologies to the open source ecosystem that build a stack of software that gets us closer to running distributed systems in a reasonably reproducible manner: - Google is bringing kubernetes (k8s) which represents their experience in deploying cluster wide applications - CoreOS is bringing etcd to the table for the cluster wide decisions in k8s - Docker is bringing a format that makes getting your applications isolated and running quickly ~~~ presspot - Mesosphere is bringing Kubernetes on Mesos, which will give you a top-to-bottom stack that approximates Google's Omega/Borg at scale. [http://mesosphere.io/2014/07/10/mesosphere-announces-kuberne...](http://mesosphere.io/2014/07/10/mesosphere-announces-kubernetes-on-mesos/) ------ djb_hackernews crickets from VMWare/EMC. Docker/containers will eat their lunch if they don't jump in and get involved. ~~~ wmf If KVM didn't eat their lunch already, why would Docker be different? ~~~ tux1968 Because it's a new paradigm, not just a competing virtual-machine implementation. ~~~ evol262 It's not a paradigm which even remotely threatens VMware's use case, though. ------ contingencies TLDR; Kubernetes is basically like a local copy of a specific-configuration cloud provider that uses docker. It's also Google Cloud Platform's basis, so developing against it lets you deploy your code there. As far as software goes, it's very immature/early days. Some of the pertinent architectural limitations that Kubernetes appears to have are: limited range of target OS platforms for services to target, non-standard mechanism of service relationship abstraction (read: lock-in warning), immature security model, limited support for complex network topologies (eg. hardware switch management), fixed approach to cluster scheduling/consensus. PS. Corrections welcome, I'm just trying to help people get a grasp without bothering with the background reading. ------ jwildeboer Interesting. No Canonical/Ubuntu. ~~~ tormeh They're Python-fans, I believe. EDIT: This used to be their recommended way of making apps: [http://arstechnica.com/information-technology/2009/08/quickl...](http://arstechnica.com/information-technology/2009/08/quickly-new-rails-like-rapid-development-tools-for-ubuntu/) ~~~ thejosh They use Go to build juju. ------ sidcool It's great to see tech giants going along well for technical growth. ------ donniezazen Google's open source investment hugely astonishes me but as far as desktop is concerned they are also hugely oblivious and ignorant (Yes, I am talking about Drive for Linux). ------ ihsw These are massive names using Go now. This is an exciting time for Gophers. ~~~ thescrewdriver I'm curious how a vague cheerleading-your-favourite-technology comment became the most upvoted in this discussion. ~~~ gulbrandr I think "This is an exciting time for Gophers." is the reason. ~~~ stephenr Gophers? That's what developers who use the Go language call themselves? What's with the ridiculously bad naming/branding in the tech world? * Gophers (the animal) are considered by many to be a pest. * The Docker logo is a whale carrying shipping containers on its back.
Shipping containers that go into the ocean are basically unrecoverable/not worth recovering, and whales spend very little of their time on the surface (meaning all the containers will go into the ocean) This is as ridiculous as having an airline named after an animal that cannot fly and kills people. ~~~ jahewson That would be Tiger Airways [https://tigerair.com](https://tigerair.com) ------ miles932 Am I the only one to notice that if you subtract the k-es from the name, kubernetes becomes UBERNET.
{ "pile_set_name": "HackerNews" }
ICloud is down? - swalberg https://www.icloud.com/?down=yes ====== dfc <http://isup.me/icloud.com>
{ "pile_set_name": "HackerNews" }
Time to NonGoogle Everything – A Call to Migrate to More Private Search Engines - dandelion_lover http://www.reddit.com/r/privacy/comments/26jdjt/its_time_to_nongoogle_everything_a_call_to/ ====== JetSpiegel The real link is [http://pastebin.com/tH5hXU1D](http://pastebin.com/tH5hXU1D)
{ "pile_set_name": "HackerNews" }
Pro-Tesla electric car bill advances in NJ Assembly - DiabloD3 http://www.nj.com/politics/index.ssf/2014/06/pro-tesla_bill_advances_in_nj_assembly.html ====== dave1619 The bill hasn't passed yet. It was approved by the Assembly Consumer Affairs Committee but still needs to be voted on by the legislature. According to [http://www.engadget.com/2014/06/06/tesla-resume-sales-new- je...](http://www.engadget.com/2014/06/06/tesla-resume-sales-new-jersey/) : "The bill will need to pass a few more of New Jersey's legislative processes to become law, but things are looking up for Tesla." Poor reporting job by TechCrunch ([http://techcrunch.com/2014/06/06/tesla- wins-back-the-right-t...](http://techcrunch.com/2014/06/06/tesla-wins-back- the-right-to-sell-direct-to-consumers-in-new-jersey/)). They made it seem like the bill was voted into law. ~~~ maratd Not only has the bill not been brought to the floor, it would need to be approved by both legislative bodies and then signed by the governor. The probability that the dealer lobby will sit idly by and let this pass ... is unlikely. Regardless, you can buy a Tesla in New York, Pennsylvania, or online. If you go to a Tesla showroom now, you'll see a huge sign that says "pay 0% sales tax", which makes sense if you buy it online. Why would I want to pay 7% more even if I had the option? All of this is just bullshit, on both sides. The dealers already lost. They lost when buying cars over the internet became a viable option. ~~~ cpwright The pay 0% sales tax is surprising to me. At least in New York whenever you register a vehicle, you pay the sales tax; no matter what state you bought it in. Even if you didn't have to pay the tax to register, my guess is that the NJ law would be written as a sales and use tax; and you would be responsible for it anyway. It wouldn't exactly be hard for the state to figure out who has a Tesla with registration records, but who had neglected to pay the sales tax and slap them with additional fines and penalties. ~~~ lectrick Just paid 8.25% sales tax on a Tesla in NY (Long Island). OUCH. (vs. 0% in NJ!) ------ geofffox This seems like a 'one off' deal. Tesla can come in, zero emissions exception, but conventional competition against the franchised dealer model is still prohibited. This is solely because of embarrassment, but it doesn't right the real wrong. ~~~ bsilvereagle If I remember correctly, a lot of deals Tesla has struck with other states are "one off" deals just for Tesla. ------ afternooner Question for those who understand this, why can the government force a middle man to exist in the first place? It seem to a layman to violate several other rights. ~~~ URSpider94 Car manufacturers created the franchise system as a way to avoid them having to carry the cost of vehicle inventories and repair facilities across the country. Having done so, the dealerships lobbied the state legislatures to formalize their status in law, so that they couldn't be disintermediated by the manufacturers later. I believe that the argument in their favor, other than blatant self-interest, was that dealerships provide post-sales warranty service, which would otherwise not be readily available. In some states, IIRC, the statutes are written to say that vehicles can only be sold from a location that also offers warranty service. ~~~ Shivetya If Tesla attempts to prevent independent service facilities from providing warranty and service work then be very worried. Laws exist to give consumers the choice for a reason. 
With multiple dealers at least you can choose which one to buy from, who has better people, better service, and better pricing. Why people want Tesla only stores is beyond me, one choice means no choice. ~~~ Crito It's not about wanting only Tesla stores. It's about wanting Tesla stores at all. If consumers really do value dealers, then dealers will continue to exist. The reason that dealers are scared is that the dealers themselves do not believe that they provide a valued service. ------ esbranson > _... would allow a start-up electric car maker, like Tesla, a reasonable > period of time to ramp up operations (or sales volume) before they conform > their business operations to the franchise model, " [president of the New > Jersey Coalition of Automotive Retailers Jim Appleton] said._ > _" Should there be some kind of time limitations within the bill or are we > setting the stage for a situation where the historic dealership model is > going to be hurt 10, 20 years up the road?" [State Sen. Brian Rumpf > (R-Ocean)] said._ Proof positive that the "captured agency" in regulatory capture[1] is actually the entire government, or at least the entire Republican Party. Government officials don't even try and hide that they're opposed to hurting established industries. The only difference is that in the 21st Century its not a railroad and oil company they're constantly trying to shield from competition. [https://en.wikipedia.org/wiki/Regulatory_capture](https://en.wikipedia.org/wiki/Regulatory_capture) ------ Ryel Sorry for being a bit off-topic but I'm fascinated by Elon's boldness in claiming that hydrogen fuel-cell tech is a dead end for automakers even though they are doubling down in it's investment. I would love to be knowledgeable enough to have a reasonable opinion on this topic. Could anybody recommend reading material where I could learn about some of what Elon is doing, what automakers are doing and what the pros vs cons are of both without getting too intimidated by the science behind it? ------ afternooner Why can the government force a middle man to exist in the first place, that doesn't seem in line with capitalism at all? ~~~ dodders Any form of government intervention in the markets is out of line with pure capitalism, e.g. farm subsidies; import taxes; bailouts of the auto and finance industries; medicare; medicaid; social security... one could make an argument for Fannie Mae and Freddie Mac as well. That's not to say that the above are not reasonable policies for an advanced western economy, just not that they could reasonably be considered purely capitalist. ~~~ esbranson No, there is no "purely capitalist". You confuse "laissez-faire capitalism" with "capitalism". If what you say is true, "laissez-faire" would be a superfluous adjective, like calling it "capitalist capitalism". ------ rodandar Can anyone explain to me why Telsa should get an exemption from the law and not other car manufacturers? Seems to me - that everyone should have to play by the same rules. I'm a Tesla fan, and I'm not a fan of car dealers, I just don't understand why Tesla should get an exemption just for them? Nobody seems to want to talk about this....thoughts out there other than: government should support electric cars? Should GM be able to sell the volt direct, but not other cars? Someone please explain! ~~~ esbranson Yes, that is what they're saying: GM be able to sell the volt direct, but not other cars. Every company would be treated the same ("play by the same rules"). 
They can all sell zero emission vehicles direct to consumers at a maximum of four locations in NJ. Why does Tesla get this exemption? Because corruption is only acceptable to the population if everyone is doing it. In this case, NJ is one of only a handful of states that is blatantly protectionist of this established industry. ------ adventured Is there data in the wild regarding how much less maintenance a Tesla S needs than a comparable gas-powered luxury sedan? Logically I believe the premise, and I know Tesla has discussed this numerous times over the years, but I haven't come across any comprehensive studies covering this aspect since the S went on sale. ------ dang Url changed from [http://techcrunch.com/2014/06/06/tesla-wins-back-the-right-t...](http://techcrunch.com/2014/06/06/tesla-wins-back-the-right-to-sell-direct-to-consumers-in-new-jersey/), which points to [http://www.engadget.com/2014/06/06/tesla-resume-sales-new-je...](http://www.engadget.com/2014/06/06/tesla-resume-sales-new-jersey/), which points to this. ~~~ DiabloD3 The original techcrunch url didn't point to engadget when I submitted it, but thanks to the mod who changed it to the nj.gov url.
{ "pile_set_name": "HackerNews" }
Defeating Electron - felixrieseberg https://medium.com/@felixrieseberg/defeating-electron-e1464d075528 ====== siproprio This post has a lot of points that are true. But have you ever noticed how the only good app built using electron that people can point to is Visual Studio Code? ~~~ esP3FJhDD Discord? ~~~ Psyonic Also Slack
{ "pile_set_name": "HackerNews" }
"If the news is that important, it will find me" - parker http://www.mathewingram.com/work/2008/03/27/if-the-news-is-important-it-will-find-me/ ====== BrandonM I thought this was going to be a post about how reading news sites is a waste of time. That said, I'm not sure this is the best viewpoint to have. It fails when everyone holds this viewpoint, because if everyone expects the news to find him, there will be no news. Someone has to actually be out there seeing what is going on and reporting on it. Another problem is that if you've ever played telephone (<http://en.wikipedia.org/wiki/Chinese_whispers>), you know how information can get distorted as it gets passed along. A non-trivial issue I've seen in this US election is that non-news-seeking individuals hear from someone that Obama is Muslim (never mind that they also have incorrect notions of what being a Muslim means based on various pieces of "information" that has "found them"), and that affects their decision (note that I'm not actually an Obama supporter). In other words, I think the post promotes something that may turn out to be a dangerous practice which could eventually result in an echo chamber of news which fails to reflect many of the things that are actually happening. It takes true investigative reporting to uncover real conspiracies and dangerous secrets. Would anyone who believes, "If the news is that important, it will find me," be willing to do that? ~~~ dreish I wouldn't call your first point a problem. There are plenty of voracious newshounds out there that are collectively scraping all available news sites clean and originating these link posts and chain emails. That's why the phenomenon exists. As for the echo chamber, this problem is nothing new. Traditional media copy ideas from each other with less self-examination than the blogosphere, which is unabashedly divided into two camps carefully scrutinizing each other for any inaccuracy to pounce on, and they (Reuters and AP especially) frequently screw up stories that are outside the core competencies of the vast majority of journalists, like science. It's the responsibility of Obama and his supporters to dispel misconceptions about him, as it ever was, and I'm sure he and they know full well how to do it. ------ Tichy I wish, but in reality, I don't know many people with the same interests as me. I am sure that if anything important to anyone would happen, I would probably learn about it (like World War 3 starting or something), but other than that? Example off the top of my head: I don't think anybody I know would have told me about the Netflix Challenge yet. ~~~ kingnothing Are you working on the Netflix Challenge? If not, why is it actually important that you know about it? After I started asking myself questions like that, the amount of news I "needed" on a daily basis significantly decreased. ~~~ Tichy Toying around with it - certainly not expecting to win any money. Sure, "need" can be defined in all sorts of ways, not only with respect to news. Perhaps it would be most efficient to live like a reclusive monk? ~~~ kingnothing Well, I found that I was spending hours a day on reddit, slashdot, and N.YC before I realized that most of the "news" I was getting had no direct or indirect impact on my life. Now, I mostly just read the headlines on the other sites. This is the only one that I often read articles on, much less comment about. It was a huge time sink for me to be concerned about things that don't matter. :)
{ "pile_set_name": "HackerNews" }
Develop Now with ClojureScript for React Native - mfikes http://blog.fikesfarm.com/posts/2015-07-19-develop-now-with-clojurescript-for-react-native.html ====== hellofunk I just want to say..... this is really, really beautiful stuff. ~~~ mfikes Thanks :) This stuff is coming together fast now. Android should be available soon as well. :) ------ olivergeorge Love that workflow demo! [https://www.youtube.com/watch?v=Ci4uviG8S0o](https://www.youtube.com/watch?v=Ci4uviG8S0o) ------ mfikes Author here. Willing to answer questions.
{ "pile_set_name": "HackerNews" }
China Unveils New Native Operating System - adamgibbons http://sinosphere.blogs.nytimes.com/2014/01/17/china-unveils-new-native-operating-system/?ref=world ====== voidr > It said existing open-source operating systems pose security risks And by security they probably mean ability to insert their own backdoors. ------ vectorsys Let me guess. Cisco won't be suing the Chinese government for it running Java.
{ "pile_set_name": "HackerNews" }
How the Secret TPP Agreement Will Affect You (And Companies Like Us) - kavehs1 https://www.sherbit.io/how-the-tpp-will-affect-you/ ====== paulhauggis So a secret net neutrality agreement is fine, but this is where the line is drawn? I should take all of the things people told me when I raised concerns about the net neutrality agreement and use it in discussions regarding this.
{ "pile_set_name": "HackerNews" }
Steve Jobs' View on Everyone Learning to Code - windy-topology https://twitter.com/LifeTechPsych/status/1293565797378465793 ====== willcate Not J O B apostrophe S. "Jobs' view." ~~~ windy-topology yikes. thanks for the catch
{ "pile_set_name": "HackerNews" }
3-Sweep: Extracting Editable Objects from a Single Photo [video] - rellik http://www.youtube.com/watch?v=Oie1ZXWceqM&hd=1 ====== mwsherman The key here is really complementary use of ‘what humans are good at’ and ‘what machines are good at’. In this case, it’s fair to say the machine, by analyzing pixels, can’t figure out perspective very well. The human can do that just fine, given an interface mechanism. The machine is good at detecting edges and seeing similarity between pixels. Given hints from the human that ‘this point is within an object’ and here is the perspective, the machine can infer the limits of the object based on edges/colors and project it into 3 dimensions. Amazing. ~~~ gohrt The perspective analysis is done pretty darn well by the machine in these examples. ------ olympus I'm not a HN etiquette stickler, and I'm not accusing anyone of any foul play, but the actual YouTube video was submitted 17 hours prior to this post: [https://news.ycombinator.com/item?id=6358080](https://news.ycombinator.com/item?id=6358080) This is just in case you want to throw a few upvotes their way for being first. This also illustrates that late night (PDT/UTC -8) posts don't get a whole lot of votes and proper timing is crucial to getting lots of votes. ~~~ turing It was also submitted here even earlier: [https://news.ycombinator.com/item?id=6351712](https://news.ycombinator.com/item?id=6351712) Personally, I'm just glad to see this video finally getting traction. It really is _such_ a cool demo. It even stands out in the field of consistently high-quality SIGGRAPH demos. Can't wait to read the paper! ~~~ spindritf I submitted it too [https://news.ycombinator.com/item?id=6352371](https://news.ycombinator.com/item?id=6352371) It's weird that it has received quite a few votes each time and never made it to the front page. Was it a timing issue (late night, early morning, non- American hours) or is YouTube "weighted down" somehow? ------ krisoft What I was thinking all along: "Oh come on! It can't be this perfect, show me where it fails." And they did! This is indeed magic. I'm so happy to live in this age, and be part of the "Sorcerers' Guild". ~~~ rellik Yeah, I was thinking the same thing. Funny how them pointing out the failures of the product make it seem cooler (since it seems more real). ------ DocSavage The paper is not out yet, but you can read the abstract here: [http://www.faculty.idc.ac.il/arik/site/3Sweep.asp](http://www.faculty.idc.ac.il/arik/site/3Sweep.asp) ------ breckinloggins If you marked shadows and associated them with their source, could you then recover the light source(s) and be able to remove the baked shadows and recast them in real time? Also, with the shiny objects, could you specify the material properties and have it "back out" the reflection such that the reflection was recomputed as you moved the shape around? ~~~ gohrt Yes, there are other projects that do things like insert a synthetic object into a scene, with natural in-context lighting that is inferred from the light gradients on other objects in scene. ~~~ op12op12 Yes, here's a cool demo video from 2011 SIGGRAPH Asia...can't even imagine how much more things have progressed since then: [http://vimeo.com/28962540](http://vimeo.com/28962540) ------ swamp40 WOW. Forget the Photoshop stuff, this needs to be integrated with 3D printing _immediately_. Spit out a design file into Tinkercad[1] for some minor adjustments and BAM, you've made a printable 3D model. 
[1] [https://tinkercad.com/](https://tinkercad.com/) ~~~ nicholassmith That's what I thought when I saw it. Break something, take a quick snap and import it, fix the damage, send to printer. Very little 3d modelling skill required, making it way more accessible to the average person. ------ moocowduckquack I want this + i❤sketch now, but unfortunately I suspect that jumping up and down and shouting isn't likely to help. [http://www.dgp.toronto.edu/~shbae/ilovesketch.htm](http://www.dgp.toronto.edu/~shbae/ilovesketch.htm) ~~~ MichailP Thanks for the reference. Are you aware of any (commercially available) handdrawing to CAD software? I need to do a bunch of relatively simple figures for my thesis, and it would be much easier if I could just use handdrawings and than transform them to professionaly looking figures. ~~~ danboarder Have you tried Google SketchUp? It might be the closest thing to this type of ease-of-use that I've seen so far, and it's free. ~~~ MichailP Thanks, will try it out, but wrom tutorial vids it looks more on the CAD side than handdrawing side. ------ alxbrun Wow, super impressive. And meanwhile, Silicon Valley is working on the gazillionth social photo sharing app. ~~~ baddox The other side of the argument is that social networks improve far more lives than academic research projects like this. ~~~ dredmorbius Citation needed. ~~~ baddox Is it not self-evident? How many people connect through social networks, which is an obvious benefit? How many people benefit from research papers about 3d model generation from photographs? ~~~ dredmorbius _Is it not self-evident?_ No, it's not. _How many people connect through social networks_ That's roughly quantifiable. FB has roughly 1.15 billion users, not sure of its daily use stats. Some numbers: [http://expandedramblings.com/index.php/resource-how-many- peo...](http://expandedramblings.com/index.php/resource-how-many-people-use- the-top-social-media/) _which is an obvious benefit_ Now _there_ is a questionable assumption. Given that increasing numbers of people are _leaving_ FB in saturated markets, and peak membership seems to top below 50% of the population, there seems to be a limit. And I could turn up studies showing negative effects of social networking / media saturation ranging from social isolation and depression to broken marriages and lost jobs to health and life-expectancy loss due to inactivity. _How many people benefit from research papers about 3d model generation from photographs?_ First: a false equivalence and shifting goalposts. Your initial claim was "most of the academic research". Secondly: academic research covers a huge range of areas, from improved health and diet to better machines and alternative energy sources to faster and more accurate computer algorithms. Third: what you see as a useless toy has some pretty evident applications that I can consider. Attach this method to a 3d CAD/CAM or printing system and you have manufacturing or parts replacement from a 2D photograph (AutoDesk has demonstrated similar modeling/capture systems but based on multiple images, but these can come from any camera). Art interpretation, archaeology, X-Ray modeling, geological imaging, and astronomical applications come to mind. There might be applications in protein folding or other microscopic imaging applications. And the beneficiaries of such technolgies could extend far beyond just those who are currently plugged in. 
Blindly claiming social media vastly exceeds the value of such research fails to pass the most casual of sniff tests. ~~~ baddox I don't think it's reasonable to question that assumption. Humans are social creatures, and social networks make it easier to connect with people over arbitrary distances. To deny that social networking is not beneficial is equivalent to arguing that telephones and postal services are not beneficial. Your analysis focuses only on Facebook. Of course people are leaving Facebook. But is the total user population of all social networking apps decreasing? I doubt it. > First: a false equivalence and shifting goalposts. Your initial claim was > "most of the academic research". Poor phrasing on my part. My original goalpost was "the academic research _like this_ ," which is admittedly vague. What I meant was research projects focused on image processing and interpretation. > Third: what you see as a useless toy has some pretty evident applications > that I can consider. I don't see it as a useless toy. I just think it's far less useful than social networking services, which have a very practical obvious benefit. > Blindly claiming social media vastly exceeds the value of such research > fails to pass the most casual of sniff tests. It's not a blind claim, it's what I feel is an extremely obvious claim. ~~~ dredmorbius _I don 't think it's reasonable to question that assumption_ It's reasonable to question _ALL_ assumptions. _Your analysis focuses only on Facebook._ No it doesn't. I pointed at FB as the largest of the present SNs, but referenced other SNs as well. FB is a leading exemplar of the field. My use of it isn't intended as exlusionary of other SNs. _My original goalpost was "the academic research like this,"_ Which largely moots the rest of the argument. Though as I pointed out, "research such as this" actually _does_ pose some reasonably interesting and useful applications. We can argue over those magnitudes, but I'll stick with my initial assessment that the net benefits of such research are likely to be high. Also, but narrowly identifying what you feel is and isn't valuable research, you're sharply skewing the results to your favor. It's as if I said "but I meant by 'social media' 4Chan and HotOrNot". _it 's what I feel is an extremely obvious claim._ And it's what I feel requires citation. Which you've failed to provide, being rather more inclined to engage in rhetoric. HAND. ------ Raphmedia This is sorcery! This technology is awesome. If it's as user friendly as they make it looks, I could see a lot of application for that! ~~~ breckinloggins One application I can see is teaching people how to model objects in 3d. You could use this as the 3-dimensional analog to tracing and have tutorials where you first get good at tracing the model and then try to recreate it from scratch. For example, I have only tried my hand at 3d modelling once or twice (and sucked at it enough to give up), but just watching this I feel like I could model vases and lamp posts with a bit of practice. ------ dharma1 most impressive thing for me about this demo is how good the shape detection is (seems way better than magnetic lasso in Photoshop), and how they brought different pieces of separate technologies together to such a fluid experience. And how the presenter sounds about 12. These guys/girls know what they're doing. ~~~ snogglethorpe > _seems way better than magnetic lasso in Photoshop_ Indeed, and it's very impressive work. 
It makes sense that this is the case, because this system is doing edge detection with fairly strict constraints: the edges must match the outline of a fairly simple shape which you roughly know the size and orientation of. That seems like it's inherently going to yield better results than completely- unconstrained edge-detection as in photoshop.... ------ martindale This is the single most impressive example of image processing I've seen to date. ~~~ acgourley I think they patchmatch algorithm they use to fill the background is cooler, to be honest. Check out their video: [https://vimeo.com/5024379#at=0](https://vimeo.com/5024379#at=0) ~~~ dharma1 nothing new though, photoshop has had content aware fill for a while ~~~ VikingCoder The video is 4 years old. ------ lsh This seems quite similar to this presented in 2011: [https://www.youtube.com/watch?v=hmzPWK6FVLo](https://www.youtube.com/watch?v=hmzPWK6FVLo) [http://www.kevinkarsch.com/publications/sa11.html](http://www.kevinkarsch.com/publications/sa11.html) ------ tbatchelli It looks so simple, yet my limited understanding of image processing tells me this requires a ton of research and technology. The pace of innovation is staggering! ------ atopuzov I wish I had the time to sit down and understand all the math and algorithms behind this. It's awesome. ------ jostmey I am skeptical, although I remain hopeful that my skepticism is misplaced. The "software" somehow seems to know what pattern of colors should exist on the other side of the object. Can someone explain to us how this aspect of the software works? ~~~ prezjordan Looks like the flip whatever is on the visible side. If you look at the underside of the telescope, it's just a repeated pattern of what was originally visible. ------ baddox Is there a reason many of these crazy image processing technologies never seem to have actual demos or releases? The only exception I can think of it the "smart erase" idea, which has been implemented in Photoshop as well as Gimp. ~~~ wahnfrieden What other image manipulation software do you follow closely? ~~~ baddox I don't follow any closely, I just remember seeing several tech demos similar to this. ------ snogglethorpe A lot of cool rendering/modeling research seems amazingly well-suited for the film industry and this is a perfect example ... besides the obvious applications in making CGI versions of real-world scenes, you can just imagine the director saying "oh no, that lamp is in the wrong location in all that footage... move it (without reshooting)!" I wonder if it's just a coincidence, or whether the mega-bucketloads of money the film industry throws at CGI are a major factor in funding related research even in academia? ~~~ bsenftner Not to imply that this technology is anything short of fantastic, if you look closely at the video again you will notice fairly obvious artifacts when an object is 'moved' from it's original location - the background replacement is only so good from a single image. Likewise, the 3D objects themselves created by this system show unrealistic artifacts. I'd like to see the results after they expand this system for multi-photo input, of the type used in film with multiple images from a moving camera. My point being, this is a fantastic combination of known technologies to create something truly new, and with refinement will be suitable for feature film work. However, as it is shown in the video, not high enough quality for VFX applications. (Disclaimer: VFX pipeline developer here.) 
~~~ snogglethorpe > _However, as it is shown in the video, not high enough quality for VFX > applications_ Sure, understood. The thing is, I imagine film VFX guys are _already_ doing this kind of task—making 3D versions of real objects from the movie and doing CGI additions from them—and tools like this (with, as you say, refinements) could be a great help in speeding up that process... ------ zem this is one of the most impressive things i've seen in a while. ------ zxcvvcxz Question for the entrepreneurs: how would one monetize such a cool algorithm? I come across plenty of cool stuff like this, but without any idea how they can solve real problems. ~~~ bsenftner This tech, as is, is suitable for one to make models of most of their household furniture as well as the rooms of their house. Possible applications: 1) Virtual home makeover, 2) child's "play/doll house" is their own home (virtual or 3D printed)... and on and on and on... Note that this system does not handle irregular, organic shapes (people, plants), so those need a different solution. ------ jack-r-abbit Also awesome is that it handles the background replacement so well. This could also be used to just remove an ugly lamp post, telephone pole, etc from an otherwise good photo. (assuming you can remove objects and resave the image) Edit: I am aware that Photoshop has some of this available. I've not played with it so I don't know how they compare. ~~~ pwny If you're just removing part of the image after cutting around it with a tool like this, having the object interpreted as 3D isn't really going to be of any benefit. The impressive thing here, imho, is the seemingly effortless and seamless transition and replacement. The background is fixed and the surface texture is stretched in what seems like real time. ~~~ jack-r-abbit Yes... I know the 3D part is the more impressive part. But I was also impressed with its ability to back fill the background. ------ hazz This is amazing. My first thought is this could allow F1 teams to get a much better idea of what new packages their competitors are bringing to races early on just by looking at photos and video footage and modelling the new parts. ~~~ gohrt This doesn't tell you any more about the contents of the photo than what is already visible. It can't actually see what's behind an object, it just synthesizes a plausible fictional fill. ------ TullamoreDude This indeed is very impressing and I see the how much work passion is into this project.But I still have to say it almost only about round or cylindrical objects, there is still a long way to go ~~~ kunil He did a couple rectangle ones too. And it's cylindrical tool handles a lot more than cylinders. ------ voltagex_ Is it too much to hope that this tech will be implemented in a program that's within an "average" user's budget? (i.e. non-enterprise). ------ deadfall I think this is really impressive. Do you think it will be years before this actually gets used in public 3D modelling tools? I vote for this to be used with 3D printer ------ EGreg This is awesome - but how do they reconstruct the backgrounds that the objects previously obscured? There must be more photos? ~~~ dharma1 i thought about that too - i think the background is simply a mirror image of the foreground, and that the object 3d shape is symmetrical ~~~ EGreg apparently they are using some other algorithm to do this - even more impressive! however, it seems strange in the first example how mountain ranges appear where none were before... 
how did the algos know to put it there? ~~~ abrichr This is the PatchMatch algorithm: [http://gfx.cs.princeton.edu/pubs/Barnes_2009_PAR/](http://gfx.cs.princeton.edu/pubs/Barnes_2009_PAR/) ------ Beltiras This video is currently unavailable. Anyone else getting static@youtube? ------ pjgomez Simply astonishing... imho this technology is revolutionary. ------ agumonkey A worthy successor to SketchPad, beautiful user interface. ------ DavidPlumpton I read this as "from a single photon" ------ scoofy I'm going to need to buy more filament...
{ "pile_set_name": "HackerNews" }
Will Apple hunt down German WePad tablet over the "Pad" trademark? - dujkan http://www.geek.com/articles/chips/will-apple-hunt-down-german-wepad-tablet-over-the-pad-trademark-20100415/ I bet they'll sue... ====== Zak On one hand, the name seems pretty obviously derived from iPad. On the other hand, I'm writing this on a Thinkpad; people other than Apple have been using "pad" in computer names for quite some time now. ~~~ CWuestefeld As I understand trademark law, the fact that something is an obvious derivation isn't relevant (that would be a copyright issue). _Note: IANAL_ The purpose of trademark law is not to protect the _seller_ , but to protect the _buyer_ by preventing confusion. If there's potential that a reasonable buyer might see a "WePad" and think they're buying an Apple iPad product, then this would constitute a trademark violation. But if we don't think that a reasonable person would be misled, then there is no trademark violation. EDIT: spelling ~~~ ZeroGravitas That was the original intention but (and IANAL) I read something recently that said it had been extended so that if something damaged your brand, then you got to say whether it was acceptable or not. ~~~ CWuestefeld I've never heard this, so I'd be interested to see a citation. Even if true, I don't see how this could constitute damage to Apple's brand. The fact that Apple's name is used to help understand what a competing product is doesn't do any damage to the brand. It seems equivalent to putting on the box "compare our product to an iPad". ------ char I hope not. Trademarking common words such as 'pad' is ridiculous. It's not like they invented some awesome word, like 'cromulent' and don't want others using it. Before we know it, they'll be going after MaxiPad, too. ~~~ rkowalick Your use of cromulent has embiggened us all. ------ xtho Apple should rather add what is currently missing from the iPad instead of making fools out of themselves.
{ "pile_set_name": "HackerNews" }
Backing up elasticsearch indices with curator and minio - aboullaite https://aboullaite.me/elasticsearch-curator-minio/ ====== ggm AWS specific. Elastic is on GCP too y'know... ~~~ aboullaite This has nothing to do with AWS! I'm aware that Elastic is available on other cloud providers :) this is an on-premises alternative to S3. ~~~ ggm The scripts embed Amazon dependencies don't they? If I mislabel this as AWS-dependent, can you explain how I use this to do a dump on a GCP-hosted ES instance? ~~~ aboullaite I think you just missed the purpose of the article! It simply explains how to back up your ES indices on Minio, which is an S3 alternative, and that's why the S3 plugin is used! There are different plugins for different cloud providers, but that's beyond the scope of the post!
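For anyone who wants the gist without clicking through: the setup the post describes boils down to pointing an S3-type snapshot repository at a Minio endpoint and letting Curator drive snapshots into it. Below is a rough sketch; it assumes the repository-s3 plugin is installed, the host, bucket, and repository names are invented for illustration, and exactly where the endpoint and credentials go (repository settings vs. elasticsearch.yml/keystore client settings) varies with the Elasticsearch version.

    # register a snapshot repository backed by a Minio bucket
    curl -X PUT 'http://localhost:9200/_snapshot/minio_backup' \
      -H 'Content-Type: application/json' \
      -d '{"type": "s3", "settings": {"bucket": "es-backups", "endpoint": "http://minio.example.local:9000"}}'

    # take a snapshot of all indices into that repository
    curl -X PUT 'http://localhost:9200/_snapshot/minio_backup/snapshot_1?wait_for_completion=true'

Curator's snapshot action then only needs to reference the same repository name in its action file.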
{ "pile_set_name": "HackerNews" }
Ask HN: Freelancers in the US: What do you do for health insurance? - patientfrog Currently getting health insurance for my wife and me through my employer and plan to transition into freelance&#x2F;consulting. I&#x27;ll have to start paying for insurance and was curious what other freelancers on HN are using, what monthly costs are like, etc.<p>I am especially curious what freelancers in the NYC area are doing.<p>Thanks! ====== Beached I just had to purchase health insurance two week ago in the state of MI(Just turned 26). I had 3 options: 1\. Obama Care - I qualified for a $16 subsidy... 2\. Individual plan from a health care provider in state 3\. Health Share plan (think Credit union for health care) Obama care cost 2x as much as an individual plan, and almost 3x as much as a health share plan, so even after the subsidy, this was not an option. Health share plan was very affordable, however it only worked with the health care providers on the east side of the state(I live on the west), it had a low yearly deductible, much lower then the individual High deductible HMO, however it had a 20% copay for all health services, including routine physicals and checkups. Individual Plan from an instate health care plan costs slightly more each month, but all routine check ups, generic brand pharmaceuticals proscribe from my doctor, and any specialist (Chiropractor, Knee Doctor, etc) visits are 100% covered IF I am referred by my primary care physician. I went with an individual plan for about $140/m and $28/m for dental coverage. A political side note: this same exact coverage would have only cost me $84/m before 2014 when Obama care went into play. (edit: Health Care cost, not Heath + Dental) ~~~ gwbas1c The cost went up because of the pre-existing condition situation. Pre Obama Care, if you had any kind of pre-existing condition, you wouldn't be able to buy the $84 coverage. (I got stuck having to get a normal job because of a pre-existing condition.) ~~~ Beached Yes, Since I do not have any pre-existing condition, and have never smoked, that $84 would have been mine. Because no insurer can decline someone, my cost went up to make insurance for people with per-existing conditions more affordable. ~~~ Aqueous The fact that you didn't have to buy insurance until 26 is a result of Obamacare too. Think of all the money you saved being on that insurance for several extra years. You're welcome! And maybe in part because you're spending that $84 now someone else who has a pre-existing condition and no health insurance will be able to not die or go bankrupt if they get something serious. You have to buy insurance because our system now factors in the strain on the system you would cause if something happened to you and you didn't have insurance. Just think of it like a tax (or the tax penalty like a tax, and the subsidy like a reward for responsible behavior.) That's how the Supreme Court thought of it. ~~~ Beached >The fact that you didn't have to buy insurance until 26 is a result of Obamacare too. Think of all the money you saved being on that insurance for several extra years. You're welcome! yes i am aware that the 26 rule is an obama care result. However, if I am nice, and I am, I pay my parents the extra cost per month to go from a spousal plan to a family plan, which is $85/m, shocker.... However if my parents had other kids younger then me that required a family plan it would have saved me money, yes. 
In my response to another post, I noted that this was really the only positive to Obama care that I experienced. >And maybe in part because you're spending that $84 someone else who has a pre-existing condition and no health insurance will be able to not die or go bankrupt if they get something serious. I don't think it is fair that I have to pay for someone else's health care expenses. ESPECIALLY since that someone else doesn't take the time of day to shop around for the best coverage for the cost, care to look at success rates to avoid re-works, or, heaven forbid, self-diagnose themselves and go directly to a specialist that can result in more expensive care. If people actually had to pay for the coverage they used, they would take the time to shop around and make informed decisions, and do things like exercise and eat well, avoid smoking and recreational drugs, and then people with pre-existing conditions and low incomes wouldn't have to pay so much for their care. It's called capitalism, and the health insurance industry would look VERY different. Obama care only gets us farther from where we need to be, and makes the entire situation worse. I know I sound like a jackass right now because some people would suffer in the short term, but if you eliminated obama care, and the health insurance industry, we would be better off in 10 years. Keep Medicare for the poor, old, and disabled if you want; keeping it will have very little effect as it's an income tax and not a premium subsidy. >You have to buy insurance because our system now factors in the strain on the system you would cause if something happened to you and you didn't have insurance. Just think of it like a tax (or the tax penalty like a tax, and the subsidy like a reward for responsible behavior.) That's how the Supreme Court thought of it. If the government allowed care providers to go after uninsured individuals' assets instead of eating the costs, AND if the government allowed back-charging for care given to qualified medicare/medicaid individuals, there would be no need to increase the burden on me to help pay for everyone else who goes in for care uncovered. Over 75% of the uncovered costs pre-ACA are people treating the ER like their PCP anyways. If we allowed capitalism to work, those individuals who go to the ER for a cold would eventually go to a PCP and only pay $25 for the visit and $15 for the generic meds. ~~~ lentil_soup I'm sorry dude, but not every pre-condition is because people don't exercise or don't eat well. ------ shiro I became a freelancer 13 years ago. On COBRA for a while. Looked for IEEE group insurance but HI wasn't covered then. The insurance company (HMSA) offered a switch to an individual plan. It wasn't too bad initially ($350/mo for me and my wife, HMO, $500/$1000 ded.) But then the premium soared and also we had a kid; at peak it was $1050/mo without drug, vision and dental. I learned antibiotics are pretty expensive. After ACA, it got a lot better. Currently we have PPO through HMSA in HI. About $650/mo for me, my wife and one kid. Kind of high ded ($1500/$3000), but drugs are covered, along with basic vision and dental. ~~~ logfromblammo You should visit your local pet store, in the aquarium fish section, and check out the products available for maintaining the health and well-being of your fish. And you should also learn the generic drug names or IUPAC chemical names for the commonly prescribed antibiotics.
I'm not saying those two suggestions are in any way related, other than the possibility that doing both could save you a great deal of money, in certain situations. Otherwise, grocery stores and big-box stores with embedded pharmacies often sell generic antibiotics at prices below your drug co-pay, but you would still need a prescription, and therefore an office visit co-pay. It pays to shop around. ~~~ welshguy I have an unhealthy fish. What brand would you recommend? ~~~ logfromblammo I can't recommend anything, as I am not a veterinarian. You need to do your own research. Fortunately, the web makes this rather easy. Try searching for "aquarium antibiotics" and refine your search terms as necessary. ~~~ b_t_s Fishflex FTW....for fish of course. It's even better than fishmox :P ------ asciimo The Freelancers Union now offers health insurance ([https://www.freelancersunion.org/benefits/](https://www.freelancersunion.org/benefits/)). NY was their pilot state. ~~~ colinbartlett I don't really understand why Freelancers Union is useful anymore. I thought the point of those "unions" were work-arounds to the old system where group rates were drastically cheaper. But with the Affordable Care Act, individual plans are no longer medically underwritten, so how are those groups helpful? ------ bradwschiller Oscar is great in the NYC / NJ area ([https://www.hioscar.com/](https://www.hioscar.com/)). They do a good job of lveraging technology and are less confusing than other insurance companies. They even have a program with Misfit where you get $1 per day for reaching your steps goal. Lowest premiums I've found. I'm paying less than $1,000 per month total for myself, my wife, and my kid. ~~~ colinbartlett Second vote for Oscar here. My wife and I pay north of $1,000 but have something of a Platinum plan and it covers _everything_. The Affordable Care Act has really taken the pain out of insurance when working for yourself. Things like Freelancers Union were almost a necessity before the ACA. Now individuals can buy insurance at competitive rates and without worrying about "pre existing conditions". ------ silentbob46 Like others have mentioned...if you're in NY, just go to [https://nystateofhealth.ny.gov/individual](https://nystateofhealth.ny.gov/individual) to research and enroll in an ACA-compliant individual health plan. Personally, I went with a plan from Oscar ([http://hioscar.com](http://hioscar.com)). Oscar offers reimbursements for gym memberships, free unlimited access to Teladoc ([http://www.teladoc.com/](http://www.teladoc.com/)), and Amazon gift cards if you take enough steps per day (tracked by a free Misfit Flash). Pre-2015, I had insurance with Freelancers Union ([https://www.freelancersunion.org/](https://www.freelancersunion.org/)). Their plans are not subsidized, but with some of them you get free, unlimited access to doctors at Freelancers Medical ([https://www.freelancersmedical.org/](https://www.freelancersmedical.org/)). They have two offices, one in Downtown Brooklyn, and one in the Financial District. ------ eibrahim Don't listen to politics in the comment. Here are some real life examples: I am married and have 2 kids. I got a gold plan through the marketplace for $1250 a month. I got no subsidies since I make more than is required. It's a pretty good plan but expensive. My last job which was pre-obamacare they were deducting $450 a month every 2 weeks for just my wife and I. So nothing really changed for me. 
On a sidenote my cousin is low income and she got a killer plan through obamacare for 75 a month. So now she can afford to go to the doctor and not "walk it off" :-) ~~~ UnoriginalGuy Can you clarify this: > deducting $450 a month every 2 weeks That is confusing/doesn't make sense. ~~~ potatosareok $450 a deducted per paycheck. Paycheck every two weeks? ------ midnightmonster I am male, early thirties, married with children. We have a high-end ACA- compliant plan purchased directly from Florida Blue. At almost $1600/month, it's really freakin' expensive. But we have various health issues between us that make us uninsurable for any reasonable amount via individual plans, and that also mean we use a lot of health care. Prior to ACA, we had my wife and kids on a COBRA plan out of Maryland from when we lived there (God bless Maryland for making COBRA last as long as you want, and making it last even when you leave the state) and me on a separate individual policy. Together they were about as expensive as our current plan, and they had higher deductibles/OOP max and poorer coverage. Still grateful for the Maryland coverage, though, as I would have had to have a normal job once we moved to FL otherwise. ~~~ colinbartlett This is very interesting to me. It was my understanding that the rates were no longer jacked up, even if you had "various health issues". This is called Medical Underwriting and I thought it was forbidden by the ACA. It could be a state-by-state issue. Perhaps Florida is different? When you applied for insurance, did they acquire your medical history? 1\. [http://kff.org/health-reform/perspective/how-buying- insuranc...](http://kff.org/health-reform/perspective/how-buying-insurance- will-change-under-obamacare/) 2\. [http://blog.rmhp.org/2013/12/sticker-shock-obamacare-and- the...](http://blog.rmhp.org/2013/12/sticker-shock-obamacare-and-the-death-of- medical-underwriting/) ~~~ midnightmonster I implied a slightly scrambled chronology. To clarify, we kept the expensive COBRA coverage from my wife's old job in MD, because prior to ACA there were no good individual insurance options for her and the kids. Now, with ACA, we have an individual plan purchased from Florida Blue. It is very expensive, but not much more so than the two separate plans we had, and the better coverage, less confusion from the insurance being out of state, and having everyone on the same deductible and OOP max make it worthwhile. ------ mgkimsal We've got precious few choices in NC - our county has had only one insurance company providing ACA-compliant insurance for the last few years, and premiums have basically doubled in 3 years ($275/month to $530/month). It'll go up again next year no doubt. This is with the highest deductible available as well. Some areas of the country seem to have been better served with the passage of ACA, but much of NC seems in the dark ages. I _think_ we've got one other company that's now serving our county this year, and they're ... 20% more expensive, IIRC. ~~~ tanglesome That's because the NC state government refused to have anything to do with ACA. ------ zrail PPO through HAP in Michigan. $798/mo, $1500/$3000 deductible for my wife and I, no kids. I dug around on the exchange and then ultimately bought directly instead of through the exchange because we're not eligible for subsidies. ACA has basically made this a non-issue. Figure out how big of a deductible you can absorb, walk through your state's exchange and find something reasonable. 
Then (this is the most important part), raise your rate to compensate for your newly legitimate business expense. ------ dangrossman Shopped at the healthcare.gov federal exchange, now paying $223/month for a PPO plan with Aetna. Large network, no PCP election required, no problems with it the past 2 years. Don't forget that you can deduct insurance premiums on your taxes when you're self-employed. ------ dragons I'm in the Boston area, YMMV. I left my job in 2011. I did a few months of COBRA, which I found unreasonably expensive. I'm currently paying about $380/month on a BCBS individual plan with a $6K deductible (costs have risen about $5-$10/mo per year since I joined). I don't think my experience will help you much (single), but I'm putting this out there for others who might be interested. BTW I have never met the deductible and usually spend less than $1K/year on medical expenses. Knock on wood. PS I haven't noticed much effect due to ACA, except that some services that used to require a co-pay became free. It will be nice not having to worry about pre-existing conditions, too. ~~~ eropple $380/month sounds kind of high. I have similar coverage via MassHealth for $225/month (+$25 for dental). YMMV, of course. ------ kaypro Did this 3 years ago when I broke out on my own. I'm in Boston. MA has a decent portal through their MA Health Connector site. For my family (Wife, son and myself) I pay $760 on a high deductible plan for health and $99 for dental for my wife and me. Once my son needs dental it'll most likely go up by about $40 for the dental. My only advice is to really sit down and determine what you're going to need for coverage and depending on that choose a high or low deductible. We started out on a low and realized we really would be better off on a high one so switched at the next enrollment. Good luck. ~~~ slickwilli Is that per month? ~~~ kaypro Yes per month. ------ explorigin Expect to pay more (relative to group plans). If you want to see what plans are comparable without entering all your data into healthcare.gov, try [https://www.healthsherpa.com/](https://www.healthsherpa.com/) ------ batoure Yeah thanks to the ACA this is really not a problem any more... shop around and buy a plan... ------ jwolgamott Step 1: know your medical expenses. Your total cost per year is either a) You are healthy are it is the sum of your premiums or b) You (or a family member) is not and it becomes the sum of premium payments + deductible. In ObamaCare plans, this is basically the OutOfPocketMaximum. I am firmly in the (b) camp, and see yearly medical expenses as the deductible plus premiums. And so, I pay $1500 a month in health insurance premiums, but have a $1500 deductible and $4000 out of pocket max. Yearly cost: $22,000. So, depending on your health and tolerance for rick, go for a high deductible cheap premium plan OR a high premium low deductible plan. As a freelancer, premiums CAN be tax deductible (see a CPA to make sure). ------ netman21 Blue Cross Blue Shield of Michigan. When we started out it only cost about $500/month. Jumped to $890 and deductible went to $23K with the affordable healthcare act. Basically, now we pay all of our health care out of pocket. ~~~ zrail That's atrocious. We're in Michigan and only pay $798/mo for a $1,500 individual /$3,000 family PPO through HAP for my wife and I. Individual deductible for ACA plans is limited to $6,600 individual and $13,200 family, so you may want to shop around for a new plan during open enrollment this year. 
~~~ Beached The deductible cap is only applicable to plans purchased through a health care exchange. He is likely on a family plan and purchased his insurance individually. Similar ACA coverage would certainly carry a lower yearly deductible, however it would likely double his monthly premiums. This was the case for me on my individual plan. If he has a healthy family who does not need coverage outside of basic physicals, then it's likely he purchased this plan to only cover catastrophic events. I know I would rather purchase the $500/m with 23k deductible rather than the $1000/m 13k deductible if I only planned on using coverage in the event of a horrible catastrophe. ------ gervase Jesus, I thought we had it bad, but some of these other comments are making me rethink that. We're paying ~$640/mo for a $0 deductible, 0% coinsurance ACA Gold HMO, purchased off CoveredCalifornia. That's without any subsidies - like most on HN, we're just not eligible. That covers me, my wife, and my infant daughter, and includes maternity coverage. It's still about double what we were paying before the ACA, but it's a far cry from the $1500+ premiums some others here are paying. ------ schlagetown I'm in NYC, recently turned 26 and signed up for the cheapest Oscar plan available. It's only around $180 a month — fairly high deductible (~$6,000) but includes free preventative care and a couple free primary care visits per year. I believe this "catastrophic plan" is only available for people under 30 yrs old, otherwise the cheapest is more like $300. Also doesn't cover prescriptions, but I don't have any so it seemed the best option for me right now. ------ bmj It's been a while since I've looked into it, but my local tech council ([http://pghtech.org](http://pghtech.org)) offers a coverage group to provide health insurance options to member organizations. I knew a few folks that have leveraged this in the past, but I'm not sure how rates have changed due to the ACA. It may be worth seeing if such a program exists in NYC, and if it would be worth the membership costs. ------ codegeek I did exactly this as a freelancer, even though I'm not one anymore. It will cost you more out of pocket to buy insurance compared to a "group" plan through an employer. My first suggestion: Keep the health insurance through your wife if you can and if she is still working. It is hard to beat the plans sponsored by employers as they get subsidized group rates to offer to employees. But if you absolutely need to buy your own, you have 3 options: Option 1: Use Obamacare [0] and see your options. You can try healthsherpa.com [1] which is an unofficial wrapper on top of obamacare and you can compare the various plans. Option 2: You go and buy health insurance directly from an insurance company without the extra layer of obamacare in between. You can use sites like ehealthinsurance [2] to get some quotes. Option 3: Use an insurance broker. Find someone locally in your area. Sometimes brokers can get you good deals. All options have benefits and problems. I personally hated obamacare as it was too much bureaucratic crap to deal with and now you have 2 layers to work with. The only advantage of obamacare is that if you are considered poor by obamacare standards, you can get a subsidy on your premiums if you enroll through obamacare. But if you don't care about these things or they are not applicable to you, then just go buy insurance directly and not even bother about obamacare.
Monthly Costs depend on a few factors: In-network vs. out-of-network: Very important factor. You can only go to certain doctors/hospitals etc that are "in-network". Some plans only allow in-network. Some plans have both but have higher premiums. Also, out-of-network coverage is usually very limited. Co-Payment: This is the amount (usually $10-$30) that you will pay for _every_ visit to a doctor. Some plans have no co-payment while most have the range as I mentioned. Deductible: This is the amount that you will pay _first_ for any medical expenses _before_ your insurance company pays anything. So if you go for a plan with a "high deductible", then your premiums may be lower and so on. I will say that for a family, especially with kids, I personally prefer zero deductible as it can save you more over a year since kids visit doctors frequently. But if you think you won't visit the doctors as much in a year, then go for a high deductible. Again, just a choice and no right answer here. Co-Insurance: This is the portion that you will pay _after_ your insurance company has paid the remaining portion. For example, if your co-insurance is 30%, then the insurance company will pay 70% of the medical expenses and you take care of the rest. Again, to get here, you will have met your deductible first. Out-of-pocket limit: This is the _total_ amount you may pay for an entire year. Anything over this, the insurance company pays regardless. For example, let's say your deductible was $500, co-insurance 30% and out of pocket limit is $5000 for the family and you end up with a bill of $14,000 on your very first visit during a calendar year. In that case, you will pay the $500 deductible plus 30% of the remaining $13,500, which is $4,550, and the insurance company pays the rest; had your share gone over the $5000 out of pocket limit, it would have been capped there. After that point, you will not pay anything more for that whole year (except copays). Plans with a higher out-of-pocket limit logically tend to have lower premiums. PCP (Primary Care Physician) required: This may not affect cost but it is an important factor to know. Some plans require you to choose a PCP and only use that PCP as your well, PCP. You have to let the company know if you change PCP. Specialist Referral required: Some plans require you to get a referral from your PCP before you can visit a specialist. This is critical as you cannot go to a specialist on your own in that case. Hope this helps. Happy to give you more inputs if you need. [0] [https://www.healthcare.gov](https://www.healthcare.gov) [1] [http://www.healthsherpa.com](http://www.healthsherpa.com) [2] [https://www.ehealthinsurance.com](https://www.ehealthinsurance.com) ~~~ zrail > The only advantage of obamacare is that if you cannot buy insurance yourself > that easily with things like pre-existing conditions etc. Medical underwriting is no longer a thing industry-wide and hasn't been since the beginning of 2014. No matter where you buy your plan, they will not ask about pre-existing conditions. It's true that the only reason you'd actually buy through an exchange is to get the subsidies, but even if you don't expect any it's worthwhile to check. They offer a pretty comprehensive survey of the companies that offer insurance in your state and roughly how much you'll pay, and you can go to those companies separately and check into the plans they don't offer on the exchange. Another idea is to find a local insurance broker in your area. They're free to you (the insurance companies pay them) and they'll find you a good plan that fits your needs.
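To make the deductible / co-insurance / out-of-pocket mechanics laid out above concrete, here is a tiny sketch of the arithmetic; the $20,000 bill is just an assumed number to show the cap kicking in, and real plan terms vary:

    def patient_share(bill, deductible=500, coinsurance=0.30, oop_max=5000):
        # Patient pays the deductible first, then the coinsurance fraction of
        # the remainder, but never more than the out-of-pocket maximum.
        share = min(bill, deductible) + max(bill - deductible, 0) * coinsurance
        return min(share, oop_max)

    print(patient_share(14000))  # 4550.0 -- below the cap
    print(patient_share(20000))  # 5000   -- capped at the out-of-pocket max

Copays and premiums sit on top of this, which is why the comments in this thread keep weighing low-premium/high-deductible plans against the opposite.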
~~~ marincounty "The only advantage of obamacare is that if you cannot buy insurance yourself that easily with things like pre-existing conditions etc." It has vastly improved the health care the poor receive. I'm not a huge fan of Obamacare, but it is all he could get past the Republicans at the time. He tried to push for a sensible solution, but Republicans fought it. I am waiting for anyone to propose a better solution to Obamacare, but keep the bill's core requirements. As to how for-profit insurance companies have exploited us while blaming Obamacare: I hope there's a special place in hell for these heathens! In the original Obamacare bill there were measures that would limit rate increases and out-of-pocket fees--the Republicans got rid of all of them. I recall them saying, 'Get rid of this language/requirement and we might pass it?' What I am trying to say is get rid of Obamacare, but replace it with something better. I haven't heard any real alternative plans proposed by the Republicans. I thought the Republican doctor (Bobby Jindal) would have a thoughtful, pragmatic plan--yeah, he has a plan, but it just sounds like basically going back to the free market system we had before? That worked so well? ~~~ nsxwolf I don't think what we had before could really be described as a "free market system". It was an employer-based healthcare system that shut individual buyers out of the only good deals, enacted by the government via the tax code. The new system is the same, except now there are no pre-existing condition exclusions and the individual is forced to buy a plan. The plans themselves are the same crappy plans with a few tweaks. The consumer of healthcare is still completely disconnected from the price of the goods sold, completely screwing up market signals and making sure prices stay astronomically high. This is not a free market by any measure. ------ tanglesome I set up a Type C corp for my business. With it, the company can pay for all my employees' (just me and my wife) medical expenses. For health insurance, in NC, I pay out the nose: $1,200 a month. It's a good, but not great, Blue Cross plan. In NC, that's really the only company that will look at you. ~~~ Retric $600/month pretax is not that bad, but I suspect you can do better. [https://www.ehealthinsurance.com/](https://www.ehealthinsurance.com/) has lots of plans starting at $264.45/month for a 35 year old. Granted the deductibles look high, but unless you have ongoing health issues you're generally better off using a higher deductible and putting the difference into a savings account. PS: Don't forget the high ROI part of health insurance is simple negotiating power; the risk mitigation part is less useful. ------ ojiikun I elect to have none, and pay a yearly penalty to the federal government as a result. I earn and save money on my own to cover medical expenses. I view payments to a 3rd party who then banks their business model on my _not_ needing payouts as a form of gambling on human welfare, which I deem immoral. ~~~ peacemaker While I agree the whole system is immoral, I have to ask what will you do if you have a critical injury or disease? Lots of things can happen outside of your control and healthcare is an insurance against going bankrupt.
------ rch I haven't used them myself, but I've always been curious about the group plans available to IEEE members: [http://www.ieeeinsurance.com/us/home.aspx](http://www.ieeeinsurance.com/us/home.aspx) ------ dnm When I was contracting (pre-Obamacare), I joined my local Chamber of Commerce and that made me eligible for their group plan with the local BCBS (family plan). Started with a PPO. Eventually downgraded to an HMO due to costs. ------ hellofunk When I was in NYC I joined the Freelancer's Union which provided health insurance. That was a few years ago, but it was quite helpful. ------ orasis Obamacare/BCBS is working great for us. ------ daxfohl Ask HN: What is the best subject to invite trolls rather than subject matter experts into the conversation? ------ jedanbik I pay $228/month for a personal silver plan down here in NC through the marketplace. ------ bsdadmin I went to a health insurance broker who helped find the best plan for IRS section 105. ------ x5n1 live in canada. ~~~ smutticus Seriously, as a Dutch/American dual citize I'm debating moving back to the NL simply for this reason. From what I can tell I would pay close to a quarter in NL for better insurance than in the USA. The US is a complete ripoff. ------ rwhitman Self employed health insurance in New York is ludicrously expensive, or at least it was prior to Obamacare. I've been on my wife's employer's plan but before that, I used Freelancer's Union which is NY only. I'm a big fan of their organization and everything they offer. The health insurance itself is a group plan and more or less provides what an employer's plan would. They also offer other types of benefits you'd expect from an employer.
{ "pile_set_name": "HackerNews" }
Scotland to use 100% renewables on time to host 2020 climate summit - solarengineer https://reneweconomy.com.au/scotland-to-reach-100-renewables-in-time-to-host-2020-climate-summit-60854 ====== mcjiggerlog This is somewhat misleading - it appears to be offsetting energy exported when there's particularly good conditions for renewables against times when there's not. In reality they are still both burning some fossil fuels and importing energy from countries that are burning fossil fuels. What really should matter is the carbon intensity of both electricity production and consumption. In terms of hard numbers for Scotland, their carbon intensity is 59 gCO2/KWh ([https://www.carbonintensity.org.uk/](https://www.carbonintensity.org.uk/)), which is actually pretty great. You can compare against other countries here - [https://www.electricitymap.org](https://www.electricitymap.org). Here's the current breakdown of their electricity production: Wind: 36.5% Nuclear: 31.1% Hydro: 16.7% Gas: 13.5% Biomass: 1.4% So right now they are 52.2% renewable energy. Presuming they are running renewables to capacity, this low carbon intensity figure is actually being kept low due to the fact they are sourcing a third of their load from nuclear. Should the Scottish government phase out nuclear as they are currently planning ([https://www.gov.scot/policies/nuclear- energy/](https://www.gov.scot/policies/nuclear-energy/)), then we can expect this carbon intensity to rise as gas is used to pick up the base load when conditions for renewables aren't favourable. ~~~ gtk40 I can't figure out which numbers you added to get 52.2% renewable. I would say wind, hydro, and biomass are renewable, with the possible addition of nuclear. Even just wind and hydro add up to (slightly) more. ~~~ mcjiggerlog It was a typo, it was 53.2% for wind + hydro. Biomass I refuse to count as it is absolutely awful for the environment. Nuclear is not technically renewable either. Regardless, I agree that getting fixated on the "renewable" label is not particularly productive. What matters is "zero-carbon". ~~~ ZeroGravitas Biomass can cover a range of technologies, some of which are greener than others e.g. if you capture methane from landfills or anaerobic digestion of organic waste then the absolute best thing to do would be to turn it into chemical feedstocks, but burning it for energy is definitely better than just letting it rot and escape into the atmosphere while also burning using fossil methane gas ------ spodek Excellent resource on this topic: _Sustainability Without the Hot Air_ by David MacKay, a Caltech-trained Oxford physics professor. Free download: [http://www.withouthotair.com/Contents.html](http://www.withouthotair.com/Contents.html). In the article on Scotland, however, not one word on the most safe, economical, immediately available, and effective technique: reducing consumption. Trying to reach sustainability without lowering consumption feels like trying to make a company profitable without considering lowering costs. I haven't visited Scotland, but most Americans could reduce their energy consumption 75% or more purely improving their lives before any challenging decisions. \--- A short description by MacKay of his book: "Sustainable Energy - without the hot air" presents the numbers that are needed to answer these questions: \- How huge are Britain's renewable resources, compared with our current energy consumption? \- How big do renewable energy facilities have to be, to make a significant contribution? 
\- How big would our energy consumption be if we adopted strong efficiency measures? \- Which efficiency measures offer big savings, and which offer only 5 or 10%? \- Do new much-hyped technologies such as hydrogen or electric cars reduce energy consumption, or do they actually make our energy problem worse? Wherever possible, I answer these questions from first principles. ~~~ atoav While I wholeheartedly agree with the sentiment of reducing energy consumption where it can be done I don't see why we shouldn't do both? And ofc there is always the danger of things like electric vehicles etc. becoming an case of indulgence: people paying to atone their sins. Consumption and shiny new things alone won't solve the climate emergency. For the UK the first thing that comes to my mind when thinking about inefficency is heating. I grew up in the alps and we usually have quite well insulated houses (incentivized by government). Using solar and heat exchangers is increasingly common etc. When I had been to England first, I was shocked at the building standards with these thin walls that barely keep the heat inside. On top they seem to heat a lot with electricity — something which you only would do in an emergency in the alps. If you can quickly save energy by modernizing heating and insulation go for it and expand renewables as well. ~~~ lol768 >On top they seem to heat a lot with electricity — something which you only would do in an emergency in the alps. How else would you propose heating a property in England in the winter? ~~~ iso947 Gas, which in my experience is far more prevalent in the UK than electric heating. ~~~ lol768 Not in new properties. I've not had gas heating in a property in the last four years or so. Natural gas is non-renewable, too. ~~~ iso947 Bought my new property in 2016, it’s GCH. Only comes on in deep winter, and that’s with a thermostat at 21C 24/7 ------ viburnum This is great news but I wish people would start talking about total energy and not just electric generation. They’re still using a ton of gas and oil for heating and transportation. It would be good to be honest about this. ~~~ hinkley I was kinda shocked to learn that long-haul natural gas pipelines are more efficient than long-haul power lines. They also tend not to fail during ice storms. The Greater Seattle Area has had whole areas without power for days or weeks within recent memory. It's not just an issue for old or rural areas. Putting all of your eggs into the electricity basket can get people killed. We should work on addressing these problems (or accepting energy diversity) sooner rather than later. ~~~ qqqwerty I am not quite sure I understand your point. Renewable electricity has the potential to be the most resilient form of energy we have ever had. You can already generate enough electricity for a typical home by putting solar panels on the roof. The main issue is storage but that is mostly an issue of cost not technology. For about 10-15k you can get enough storage to get you through a multi-day outage. And for double that you could probably go off-grid entirely. ~~~ jakehop The projected battery production in 2080 will only cover about 0,01% of Europe’s electricity demand. Batteries are not scalable and hydroelectric dams have their own geographic requirements which can’t be met in flat countries like The Netherlands or Denmark. ~~~ ben_w That’s a strange projection. 
Given that you only need to store energy for about a day[0] and that the batteries last at least three years when you use them like this, world battery production is already 1% of the storage requirements for worldwide 100% renewable. One forecast I’ve seen estimates that battery production will go up by a factor of four by by 2028. [0] longer periods with no wind and no sun are dealt with more cost- efficiently with long distance power lines than with storage. ------ Camas >Scotland [...] has a goal to source the equivalent of 100% of its electricity demand from renewable energy sources by the end of this year. What does 'equivalent' mean in this sentence? ~~~ est31 Scotland does not have an independent grid. There is an EU wide market to trade electric energy between countries. Some days they export elecric energy, other days they import it. Even if 100% of their local production is renewable, the energy they import is probably (partly) not. So I guess the 'equivalent' means that summed up over the entire year, they export at least the same amount of energy that they import. You can check this site for the present state (it doesn't show scotland though): [https://www.electricitymap.org/](https://www.electricitymap.org/) ~~~ toomuchtodo Might change with Brexit (re independent grid). ~~~ Rexxar They have nothing to win to do this. ~~~ toomuchtodo I’m not arguing the rationality, only the possible outcome. Texas has its own grid (ERCOT) because Texas. Humans are quirky. ------ nikodunk Speaking of selling/buying energy equivalents, I’ve recently become addicted to great real-time data feeds like [https://caiso.com](https://caiso.com) for California’s grid and Chrome/Firefox plugins to watch the data like [https://twitter.com/EnergyLollipop/status/122120582076489728...](https://twitter.com/EnergyLollipop/status/1221205820764897281?s=20) ~~~ toomuchtodo Check out [https://www.electricitymap.org/?page=map&solar=false&remote=...](https://www.electricitymap.org/?page=map&solar=false&remote=true&wind=false) It scrapes any public grid/ISO operator’s site for live and historical electrical generation mix data. ~~~ nikodunk Wow!! Hadn’t heard about it. Tmorow is also great for personal carbon tracking - trying them out as we speak. ~~~ toomuchtodo They are great folks, hope you enjoy their products! ElectricityMap is open source; please consider politely poking grid operators not yet providing data for grayed out areas so that the map fills out! ------ Ididntdothis When I was there I thought to myself that their wind could probably power the whole world. I have never been to a place with that much wind. ~~~ hinkley Did you see that Discovery Channel show where they documented the building of a giant oil platform for oil extraction in the North Sea? All of the things (and forces) they had to deal with... wow. The North Sea has a -lot- of energy in it, and climate change will probably accentuate that. ~~~ dasanman Yeah this is also why the west coast of Denmark is full of wind mills ------ littlestymaar > has a goal to source the _equivalent_ of 100% of its electricity demand from > renewable energy sources by the end of this year. > production _outstripping_ demand on 20 out of 30 days and over the whole > month providing _109 per cent_ of electricity demand. Emphasis mine. 
This is a great summary of what really happens if you attempt to go full renewable on wind power: you don't control the supply so you need a foreign market to buy your excess production on windy days and to provide power the other days. And no, battery isn't really an option because you can have (and usually have, in anticyclonic conditions) weeks with low wind, and you won't get enough batteries to supply your country for one or more weeks. (I did a calculation for France bases on real power data 2 years ago, and it needed about 4 Tesla Model S battery per inhabitant). ~~~ filmor Wind turbines are trivial to downregulate. The foreign market just allows you to shift the strike price for downregulation, making the generation in total more cost-efficient. ~~~ littlestymaar Technically you can, but not financially: the advertised energy cost of wind power relies on the fact that turbines produce as much as they can, and in most European countries there has been contracts between windfarms owners and governments to guarantee a buying price, decorelated from the spot market price. ~~~ thawaway1837 It may be marketed that way in press releases, etc (although I don’t even think that’s true). But I bet you anything that the people investing in these projects, and the companies pricing their power and betting their future in them, are not doing it on the basis of selling every unit of energy generated. ~~~ littlestymaar > But I bet you anything that the people investing in these projects, and the > companies pricing their power and betting their future in them, are not > doing it on the basis of selling every unit of energy generated. Wind power has been subsidized a lot in Europe during the past decade, so for at least 80% of the projects, not only the investors cared about selling every unit being generated, but they even relied on them being sold at 3 times the market price! ------ blazespin Really great of Scotland. Clearly the adults in the room. Unfortunately, the room is filled with 99.9% children and likely the only way to go at this point is to start seriously looking at carbon capture and storage. Sad, but time to face facts. The children are children and no amount of complaining has or will change that fact. Well, either that or find ways to enjoy a much warmer earth. ------ mistrial9 huge achievement -- no equivocation -- congrats ------ anonsivalley652 Scotland is also coal power-free. ------ mmhsieh the country that invented haggis is a natural leader in sustainable development ------ goatinaboat But aren’t 10s of 1000 of people flying in from all over the world for this “summit”? Let’s talk about renewables for powering videoconferencing, otherwise this is just a band aid. ~~~ thawaway1837 I’m pretty sure a few percentage decrease in Scotland’s fossil fuel usage will make up for all those flights in a few weeks, never mind all the other countries that may be driven to do the same based on the outcome of the summit.
{ "pile_set_name": "HackerNews" }
ISP punishes Elsevier for forcing it to block Sci-Hub by also blocking Elsevier - lainon https://boingboing.net/2018/11/03/balkanizing-the-balkanizers.html ====== y7 Dupe: [https://news.ycombinator.com/item?id=18370446](https://news.ycombinator.com/item?id=18370446)
{ "pile_set_name": "HackerNews" }
The Sad, Beautiful Fact That We're All Going To Miss Almost Everything - misham http://www.npr.org/blogs/monkeysee/2011/04/19/135508305/the-sad-beautiful-fact-that-were-all-going-to-miss-almost-everything ====== misham I'm curious what the community thinks about this especially as entrepreneurs and developers/designers. I spend a huge portion of my time learning new languages, tools and technologies that fitting in fiction/non-fiction books that do not directly relate to building a business or learning a new technology is extremely difficult. (e.g. I've been reading a US history book for a couple of months now) I'm sure people on HN have interests in a large variety of topics, what do you do to cover all of those topics? Or __do__ you cover all of those interests at all? Do you just pick a couple of interests and stick to them or do you dabble in many different ones and only dive in deeper if you get "bitten"?
{ "pile_set_name": "HackerNews" }
The Blockchain Is the Internet of Money - petethomas https://www.wsj.com/articles/the-blockchain-is-the-internet-of-money-1506119424 ====== geraldbauer Hello, I've started Awesome Blockchains [1]. The idea is to collect samples (and recipes) on how to build your own blockchains (from scratch) in JavaScript, Ruby, and friends. [1] [https://github.com/openblockchains/awesome- blockchains](https://github.com/openblockchains/awesome-blockchains) ------ nemoniac Two contrasting headlines side by side on HN today. The WSJ says Blockchain is the Internet of Money The Economist says Bitcoin is fiat. [https://news.ycombinator.com/item?id=15315813](https://news.ycombinator.com/item?id=15315813) ~~~ decentralised They are two different things. A Blockchain is the internet of money but some tokens and crypto-currencies are not backed by anything else than trust which could mean they could be seen as fiat currencies. The reason why the Economist is wrong is that the writer doesn't understand how mining and block issuance work so he doesn't see a reward token as a proof of work. PoW is the exact opposite of fiat because you don't have to trust the miner, you can check that he's done actual work to get that coin issued to him. ------ _coldfire To bypass paywall, no facebook account needed, drop into console or a bookmark (HN formatting weird, zerobin text there anyway): javascript:window.location="[https://m.facebook.com/l.php?u="+encodeURIComponent(window.l...](https://m.facebook.com/l.php?u="+encodeURIComponent\(window.location.href\);) alternative: [https://zerobin.net/?9b41630b08e38f96#FAC6oUt+oqX9fwxhamtom+...](https://zerobin.net/?9b41630b08e38f96#FAC6oUt+oqX9fwxhamtom+/WyM7PkDrdCU+wkCUcntM=) The bitcoin blockchain is a crappy database that's incredibly secure because it's backed by an amazing amount of irreversible thermodynamic energy. Anything you put into it is there forever, even quantum computing couldn't undo anything a few hours old. That's all the public needs to know. These articles are a bit tiring, despite being an aficionado somewhat keen for a blockchain winter. ~~~ sordidasset What's a blockchain winter? ~~~ pmcjones The analogy is to nuclear winter (life after a hypothetical nuclear world war) or AI winter (after the fall of the Soviet Union led to decreased military- funded spending on AI). ------ nnfy Bold title.
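Since both the build-your-own-blockchain repo and the proof-of-work point come up above, here is a toy sketch of why "you can check that he's done actual work" holds: mining is an expensive search, verification is a single hash. This is a simplified illustration with made-up data, not Bitcoin's actual difficulty or block format:

    import hashlib

    def mine(block_data: str, difficulty: int = 4) -> int:
        # Brute-force a nonce whose SHA-256 digest starts with `difficulty` zero hex digits.
        prefix = "0" * difficulty
        nonce = 0
        while not hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest().startswith(prefix):
            nonce += 1
        return nonce

    def verify(block_data: str, nonce: int, difficulty: int = 4) -> bool:
        # Anyone can check the claimed work with one hash; no trust in the miner is needed.
        return hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest().startswith("0" * difficulty)

    n = mine("block 1: alice pays bob 5")
    print(n, verify("block 1: alice pays bob 5", n))

The asymmetry (expensive to produce, cheap to check) is what the comment above contrasts with fiat-style trust.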
{ "pile_set_name": "HackerNews" }
No, I still don't want to work for Google - chris_wot http://infotrope.net/2012/10/29/no-i-still-dont-want-to-work-for-google/ ====== varelse I worked at Google for a short stint. I found the tech interview easy and I wasn't asked anything particularly ridiculous. I received a job offer shortly thereafter and accepted. As someone who regularly interviews prospective engineers at my current gig, I see no problem with expecting candidates to arrive prepared to answer algorithm questions or questions about their strongest programming language. Ditto for someone who wishes to change assignment within their organization, however they arrived there. If you're unwilling to provide proof you're not a bozo, you're probably going to be just awful to work with as well. However, the blind allocation policy at Google sucks, and it continues to suck. I came in as an expert in field D and therefore according to Google's magic sorting hat I ended up a natural for assignment in field Q. I tried my hand at it for several months, but as someone else has already said, bored employees quit: [http://www.randsinrepose.com/archives/2011/07/12/bored_peopl...](http://www.randsinrepose.com/archives/2011/07/12/bored_people_quit.html) In order to avoid that fate, I futilely attempted to get reassigned to something close to field D (really, B, C, E, or F would have been just peachy) and that seemingly got me flagged as trouble internally. Shortly thereafter, I got a higher offer to go somewhere else and left. However, unlike the author of this post, while Google recruiters regularly stalk my linkedin profile, none of them ever contact me, which is good. ~~~ ajross I don't follow why being an "expert" in D would make you "bored" of Q. If Q is simply a boring field to you, that's one thing (and the appropriate response is to quit and find something better). But it's not your employer's job to give you your dream assignment. If they need Q geeks and you're available, then that's what you're stuck with. That's what being an employee is all about. ~~~ seanmcdirmid As if geeks are so interchangeable like sprockets! I'm an expert in my field, I have a PhD in it, if someone asked me to do machine learning or data mining or image processing, it would be a huge ramp up time for me, say about 2 or 3 years to get to the point of expertise I have in my current field. It would be a boring 2-3 years, and something that I'm probably not very interested in. If there are other jobs out there that make use of my expertise and interest, why bother with something else? ~~~ quomopete why would you be bored learning something new? ~~~ michaelochurch One of the things I disliked about Google was the tyranny of At Google. If you didn't do it At Google, it doesn't count. You don't really know the first thing about anything unless you did it At Google. It's an extreme case of institutional arrogance. This explains the blind allocation policy. If knowledge that is not At Google doesn't count, then there's no point in matching people with their expertise or interests, because a Noogler by definition doesn't know _anything_. In July 2011, I did some research on the strategy of the Google+ Games team and saw that the going plan was doomed to failure and seriously risked such embarrassment as to kill the entire product. (Lots of Zyngarbage, preferential treatment to mainstream publishers.) I had some domain expertise from designing a game and spending a lot of time hanging around game designers. 
So it was pretty easy for me to come up with a strategy that had a damn good chance of actually succeeding. I posted it internally and got a huge amount of engineer support. The strategy was to establish a quality-centered community first by providing a platform for independent developers, integrate it with Hangouts, and become a center for the "German-style" board game sphere. The high quality starting community would establish G+ Games as cognitively upscale, creating a comparative brand advantage that would persist in perpetuity. By the way, the Google+ Games engineers also got wind of what I'd been proposing and they supported me. It was as obvious as it can be to humans (obviously, no one can predict the future) that this strategy would work. I got a ridiculous number of emails from engineers telling me that I was right on and that they wished they were implementing "Real Games" instead of giving ridiculous preferential treatment to mainstream publishers (who were throwing us mediocre product because they didn't expect us to succeed). What got me in trouble was that a lot of high-level people didn't like that an FNG had so much engineer support. I was a recognized domain expert, but not an At Google domain expert. There were no At Google games experts, because Google had never gone into the Games space before (and that's smart, because Google did extremely well on web search by being ideologically non-editorial, but for the games space _quality_ is so damn important that you _must_ be editorial.) So it came down to politics, because Google's At-Google bias rendered it incapable of recognizing domain expertise and discovering a correct decision. Finally I got an email to the effect of, "domain expertise isn't relevant here, deal with it. Besides, you're only a SWE 3." Well, fuck you very much. I don't see why job titles matter when you're about to lose millions of dollars and would have been _making_ as much had you listened to me. A year later, I was proven right, but it doesn't matter in the least. Google+ Games is a non-concern, and I'm not a part of Google. The lesson I learned from that ordeal is not to try to "save" a company from itself because you can't. You'll be seen as right, and possibly even lionized, long after you leave... but it won't matter in the least. Keep your head down, stay employed, and enjoy the middle-row seat to the show-- that "show" being the people in charge making fools of themselves. ~~~ kamaal Again, None of what you are saying seems to be a problem only at Google. These are really MegaCorp problems, and they apply to Google as well. I knew even from the beginning even 5-6 years earlier, when there was immense desperateness among geeks to work there. It was only a matter of time when all MegaCorp problems will eventually plague Google. ~~~ chii > It was only a matter of time when all MegaCorp problems will eventually > plague Google. this is really interesting - what is the cause of all these problems? Is it inherent in a hierarchical organization? Is it because you have people who are responsible for the output of others (a manager), but isn't able to actually control that output directly, but is only able to indirectly affect it (and not very well at that)? I think this issue of "control" is central. Facebook and valve seems to have their structure right (at least, the engineering department). But both is still young and small. ~~~ michaelochurch Open allocation seems to have one drawback. 
You still _do_ need (a few) managers and executives (not to order people around, but to keep track of the bigger picture) but it's hard to hire managers from outside into an open- allocation shop because typically they want promises of authority, and OA is directed through leadership rather than intimidation. Most companies move toward the closed-allocation end of the spectrum because they perceive a need to do so in executive recruiting. Most executives don't want to take a position where they won't have the power to unilaterally fire people. ~~~ hga Regrettably rhetorical question: And this is not a useful filtering function for most of the executives you'd want to hire? A company _must_ be able to fire people (I've been in ones that went down the drain because the founders were too nice to do this, or at least do it soon enough), but this sure sounds like Lord Acton's " _Power corrupts, absolute power corrupts absolutely._ " ------ c0nsumer Back in 2007 I got quite far through the Google hiring process, up to an on- site interview and being told to expect an offer the Thursday after I returned home. I was ready to move half-way across the country and get started living in the bay area, moving my life to there. That Thursday came and went, and I found out that due to some internal bar- raising I would not be receiving an offer. I stayed here in the Detroit area, moved up with my current company, married my wife, and settled into being quite happy here. Five years on I regularly have Google recruiters contacting me both via phone and email, asking if I'm interested in a position, and exclaiming how good the interview feedback was. When I decline to revisit any opportunity which would require me to move across the country, the recruiters are universally flabbergasted. Sorry Google, the time when I was excited to move across the country has passed. I still want to do interesting things, I'll just do them on my terms now. ~~~ simonsarris I always thought that strange too. I get a fair number of recruiter emails, and almost all of them seem to assume out of hand that relocation is the least of my concerns, when it probably _easily_ the #1 thing. Every reply I give[1] ends up being a cheeky way of mentioning that I have a _life and a home_ in a place and I do not intend to abandon it for a mere job. Perhaps my current job isn't the most interesting thing in the world, but I love my coworkers, my family, my friends, my little town. That's a lot for a job offer to compete with. [1] <http://simonsarris.com/blog/626-why-i-love-recruiters> ~~~ _pferreir_ Let's hope you can afford to do that for many years to come. When the choice is between relocating or unemployment, an average job offer suddenly gets much more interesting. Unfortunately, here in Europe things are starting to change for the worse. ~~~ vidarh > Unfortunately, here in Europe things are starting to change for the worse. For very geographically specific values of "Europe". Lots of places are on the contrary seeing things improving. Europe isn't a single entity. ~~~ _pferreir_ If a growth rate of 0.5% for the EU is not a good indicator of an economic slowdown, then you must be talking about Europa, Jupiter's moon. I know that EU < Europe, but we're talking about almost all of Europe's greatest economies here. 
Yes, I know some countries are doing better than others and that in some places unemployment is even decreasing, but taking into account the fact that Spain is on the verge of bankruptcy and countries such as France and Italy are being hit by austerity, I see no reason for optimism. ~~~ vidarh Averages across hundreds of millions of people easily mask that substantial areas and sectors are thriving. Yes, it sucks if you're in low level jobs in the weaker markets. If you're in other sectors and markets you will instead be hounded by recruiters. The place I currently work, we have to be lightning quick when hiring these days, because the candidates we want get snapped up within a couple of days. ~~~ _pferreir_ And does that make the european crisis a localized thing? If any, such anecdotal evidence shows us that it is possible for a company to thrive even in the worst of scenarios (which is a positive thing, I agree), but extrapolating any macroeconomic data from that makes no sense. Even considering that there are sectors that don't suffer as much (or even get some benefit) from this crisis and that you're working in one of those, everyone will be worse off in the long run. Or is your job the only thing in the economy that influences your well-being? ------ stroboskop Key quote. _Since I’ve been out of the Silicon-Valley-centred tech industry, I’ve become increasingly convinced that it’s morally bankrupt and essentially toxic to our society. Companies like Google and Facebook — in common with most public companies — have interests that are frequently in conflict with the wellbeing of — I was going to say their customers or their users, but I’ll say “people” in general, since it’s wider than that. People who use their systems directly, people who don’t — we’re all affected by it, and although some of the outcomes are positive a disturbingly high number of them are negative: the erosion of privacy, of consumer rights, of the public domain and fair use, of meaningful connections between people and a sense of true community, of beauty and care taken in craftsmanship, of our very physical wellbeing. No amount of employee benefits or underfunded Google.org projects can counteract that._ ~~~ timwiseman Google has historically both been reliant on fair use and the public domain and has defended such in court. I'm not saying they did it out of altruism, but substantial precedents that help solidify fair use come out of Google's activities in court. ------ raldi Please don't propagate the myth that Google asks prospective programmers questions like, "How many golf balls could you fit on a bus?" Google interviewers list the questions they asked when they write up their conclusions, and anyone who asked a question like that for an engineering interview would be immediately contacted by the hiring commitee and told not to do it again. ~~~ potatolicious Google was well known for logic brain teasers like that in years past - there was a time when Google recruitment was _defined_ by such questions. They've since stopped, and sadly it looks like Microsoft has taken up the mantle for stupid irrelevant logic questions. ~~~ papsosouid Were they well known for it because they did it, or because people assumed they did it? An awful lot of people immediately jump to "stupid brain teasers" when hearing "they ask technical questions". 
I have people write code and solve actual problems during interviews, and have gotten several "these kinds of riddles are a waste of time" responses from people who are offended by the notion that I want them to be competent. ~~~ potatolicious From my limited anecdotes of the time, I believe Google actually did it. We're not talking about programming-related brain teasers here, we're talking about non-programming ones. e.g., "3 light bulbs in a room" and "family crossing the bridge" being the classic examples. ~~~ debacle Never heard the family crossing the bridge one. Which one is that? ~~~ potatolicious You have a bridge that can only take a certain amount of weight at once without collapsing. It's dark at night and you have only one flashlight. A flashlight is required to cross the bridge, which is traversable in both directions. You have a family of people of varying weights (the exact numbers you'd have to look up) - determine the optimal way for the family to make it across the bridge. ~~~ debacle Thanks. I don't think that's a bad one. In fact, like the Towers of Hanoi, I think it has enough parallels with computer science and engineering to be a good interview question. ~~~ usea Indeed, it is a specific instance of <http://en.wikipedia.org/wiki/Knapsack_problem> ------ spindritf I may be making the problem worse but both this post and the comments section are some of the worst I've seen on HN -- from linkbaity title to incorrect possessive pronouns (I barely speak English and caught it while skimming), and then the deep analysis of anonymity on G+ completely based on one or two anecdotes with no relation to G+, anonymity, or even the Internet. Why is this submission at 80 upvotes? What value am I missing that others are seeing? ~~~ michaelt Some of what Google does is tech industry gospel - for example things like technical interviews. When there's faults with it, that's relevant to my interests, as I might want to avoid their mistakes at my own company. That's why I found the article interesting, at any rate. ~~~ papsosouid >When there's faults with it, that's relevant to my interests, as I might want to avoid their mistakes at my own company. Same here, except that is why I found the article to be a crappy, linkbait, waste of time. It wasn't about mistakes in google's hiring process, it was a personal rant about what the author dislikes about google. ~~~ subsystem You're not providing much value with your comment yourself. I thought it was an interesting personal story of what can happen the company you work for gets acquired by a large company. Just because it's not "top 10 hiring mistakes" doesn't make it uninteresting. The comments also include far more insight than usual into things that aren't necessarily found elsewhere. ~~~ fourstar How would he provide value on top of something that he perceives of not having any itself? ------ bmelton I tried, but I found that I can't really identify with the plights of some. Thankfully (for reasons I'd rather not be challenged to justify, except to say that it's apparently good as measured by others) I'm a fairly normal white guy with no marginal traits that might cause me to have different viewpoints than I currently do. This means that I support the notion real names on Google Plus, and I also believe that all speech should be free, but that you should also have the courage to attach your name to it. 
Yes, I understand that there are reasonable circumstances in which that would not be ideal, but perhaps due to my aforementioned luxury of being a 'normal' white male, I am ignorant to how much they would matter in real life. I am neither queer nor gender-queer, so while I am empathetic to their struggles, I just can't identify with what are possibly very real concerns about losses of anonymity, and as I've met people who are public with their genderqueer status who haven't been assailed or assaulted, I can't help but wonder if the fear isn't simply perceived fear or not. Regardless, aside from that (which again, I empathize with, but cannot relate to) the only other thing I took issue with in the article was the categorization of the autonomous vehicle as a 'geek toy'. It isn't, and that marginalizes an entire category of technology that has a very real possibility of changing the world in a very positive way to 'something SV types are wasting money on', which I take issue with. ~~~ paganel > I am ignorant to how much they would matter in real life. I am neither queer > nor gender-queer, so while I am empathetic to their struggles, I just can't > identify with what are possibly very real concerns about losses of > anonymity, and as I've met people who are public with their genderqueer > status who haven't been assailed or assaulted, I can't help but wonder if > the fear isn't simply perceived fear or not. I live in an Eastern-European country, which also happens to be an EU-member. One of my (female) colleagues told me how two or three years ago she happened to see a trans-gendered person (I don't know what's the politically correct term) who had just been beaten up during that year's GayFest. "Blood was pouring out of his/her wounds", was what my colleague told us. That's why I down-voted you, btw, which I only do once every 2-3 months on HN. Not because I don't agree with you, which should not be reasons for down-voting people anyway, but because you kind of choose to view things through very narrow lenses, which is what the article was writing about all the way. ~~~ dpark I think you chose an exceptionally poor reason to downvote him. You downvoted him because you disagree with his views ("view things through very narrow lenses"), which is different from downvoting him because you disagree with him only in phrasing. Downvotes should be for comments that do not add to the discussion. Considering that most of the comments here are in response to him, his comment clearly added to the discussion. ~~~ Apocryphon Downvotes are for disagreements, too. ~~~ randomdata That just sounds like an easy way to rationalize your counterargument while avoiding being shown the error in your thinking. If a comment is worth disagreeing over, it is worth explaining why you disagree, otherwise nobody learns anything. Save your voting for the quality of the writing, not the arguments being made. ~~~ Apocryphon No, actually it has been discussed elsewhere in other comment threads that downvotes are used for showing disagreement, as well. I didn't come up with this myself; rather, it seems to be convention on the HN community. If you disagree with this, please feel free to downvote this comment. This isn't reddit, or YouTube comments, where karma is some sort of aspect of prestige. It should be freely given and taken away as part of the natural discourse.
~~~ prodigal_erik I once thought downvoting as disagreement was disapproved-of (as I would prefer), but in fact it is not: <http://news.ycombinator.com/item?id=2164087> ------ rmrfrmrf One thing I can't stand is "recruiters" who invite you to come interview with them. Oh, great, you "invite" me to come torture myself for months on end just so I can have a snowball's chance in hell at getting a job? This is 2012. If you want to learn about me, go on my website and read my blog posts, look at my current projects, and download my resumé. If you think I'm so great, make me an offer. Don't spam my inbox with "We're Hiring!" e-mails. ~~~ timfrietas As someone who does a fair amount of hiring, I would never hire someone off the strength of their (supposed) work alone. The ability to problem solve under pressure and their ability to communicate effectively with others are really important factors I won't get from looking at their Github account. ~~~ rmrfrmrf I agree in situations where the candidates are the ones doing the outreach. However, what I'm saying is that I don't want to be recruited by a company unless they've actually heard of me before and think my work is on par with what they want. It's bullshit to me to be spammed by a company that claims you'd be great with their team, only to be dumped into the same candidate pool as 800 other applicants. ------ mattdeboard I was interviewing at a startup in the Bay Area and the CEO insisted that since "being surrounded by people smarter than me and getting better at what I do" was a huge driver for my professional life, I'd be better served by working at Google. I did not understand that then, and I don't understand it now. I don't blame the guy for saying that I guess, and maybe he's right. That said I don't want to work at a megacorp doing software engineering, even if that megacorp is Google. It's just not for me. ~~~ nostrademons I worked at two startups and founded one before joining Google, and I have a friend that's been bouncing between working for and founding startups since he's left Google. I understand what the CEO was saying. In a startup, there's a lot of grunt work that has to be done and there's nobody to do it, and so you frequently end up taking on a lot of tasks that really don't teach you anything. In a big company like Google, a lot of that work is automated, or there's a dedicated team of people who _like_ doing that work, and you can just forget about it. Think about carrying pagers, repairing hardware, corporate IT, installing software, setting up your workstation, doing logs analysis, cooking for you, etc. It's less of a problem early on in your career when everything's new for you, particularly if you eventually want to found a startup where you have to do all that stuff anyway. A lot of engineers really want to be top of their field, though: they want to be the world expert in machine-learning, or in extraction of data from news articles, or in distributed parallel processing, and so on. You typically don't have the opportunity to do that in a startup, because you have to focus on the immediate business needs and on staying alive rather than on pushing the state of the art forwards. "Good enough" has to be good enough in a startup. ~~~ mattdeboard Interesting perspective. But like you said, at this point in my career I _want_ to know how to do all this stuff. Thankfully I'm in a position where I get/have to do basically everything, so I'm getting very broad experience. 
There are specific areas where I want to focus -- search, data analysis, infrastructure -- and my next job I'll probably be looking for a focus in one or more of those areas (each of which Google would be a good option but without a degree and at my age I doubt I'd get hired). Be that as it may, I really, really don't want to work in a huge organization. I hate bureaucracy, central authority, people I never see in the organization making policy for me, etc. I spent my entire adult life in the military up til a couple years ago. I think that has something to do with it. edit: This comment really crystallizes the type of thing I want to avoid, which means eschewing employment at Google, IBM, Microsoft,etc: <http://news.ycombinator.com/item?id=4713802> >Those were (are?) the designations for certain types of jobs over there. You might think that if you're hired into the company, your possibilities are wide open, but they aren't. System administrator flavored SREs (site reliability engineers) could only get other SA-flavored jobs, of which there are relatively few. Meanwhile, software engineer (SWE) type SREs could go into any other SWE job, of which there are many. If you were hired on as a SA-SRE as I was, then you have to do an internal interview to get to be a SWE-SRE. If you can't make it through that, you're stuck. I made it, and a friend did too, but I know people who didn't. I'm sure that makes them feel great, especially if they're already doing SWE type work in their daily jobs. I was told repeatedly there was no difference between the types, but found out the hard way when it was time to transfer from a toxic situation and there were few alternatives. It took over a year to finally get it all sorted out. ~~~ nostrademons Again, YMMV. I was hired as a UI SWE, told my manager (repeatedly) that I wanted to do more backend algorithmic work, eventually refused to do any more UI work, then taught myself enough about Google Search's backend systems that I became known as a full-stack engineer that knows everything. It probably delayed my promotion by about a year, but it was worth it, and it's not like anyone stopped me. I'm now doing basically product-manager type work, despite officially being a SWE. Again, nobody stops me. I've found that if you're producing stuff of value, you can basically move into another role simply by doing the work entailed by that role. I know two other SWEs that are de-facto PMs, and one SRE who became a SWE (and core LLVM contributor in the process). ~~~ ilf10 At google, you are thrust into a new environment, you are but a naked noogler, and your past a distant memory. Old, young, female, male, fat, skinny, likes sports, hates sports, prefers sushi, blah de dah, shut the hell up, no one gives a shit. It doesn't matter until you have proven yourself. A lot of problems come from when nooglers expect the world to do their bidding, but have nothing to show for it. At the end of the day, profit matters, lines of code matter, cleaning up and maintaining code matter, how much you help your coworkers matter, these things lead to you creating new features, these things matter and increase profitability, but you wouldn't know which new shit to add if you didn't earn your licks. Do your job, then improve things around you while doing your job, then we talk. Before I came to google, I was very successful, making kick ass fast, scalable systems, doing the job of many people, bringing glory to my managers but not to me.. 
Didn't matter though because for years I studied coding from what I would call masters, taking my licks, being humble and learning from anyone and anything. I didn't mention my past to my new team members at google.. Some of them ridiculed me at the first site of any minor bug I introduced, but inside I laughed like an insane hobo. The years of shit code I have refactored, bugs that drove me insane because they had to be fixed and no one else wanted or could tackle them.. I endured, so with a silent knowing I set out to rip apart the hearts of my naysayers with a blood curdling smile. Now I'm being promoted while those who ridiculed me are not, they in fact respect me and head my advice, usually because when they don't, they get ridiculed when their code is the source of design problems. What you did before google matters, not in how much you boast about it, but in that it gives you the skills to rise above the rest. Otherwise, you're just a blowhard talking about the olden' days that no one gives a shit about. In that way, I have found that google is a meritocracy, with people challenging each other, making everyone better, and it's something I never got at a previous employer. Before google, employers just took advantage of me. So, yeah, I would defend google to the death and recommend it to anyone who isn't a whining wimp. ~~~ mattdeboard Is this a joke? I don't get it. ------ tehwebguy The one thing I took away that really is a pain in the ass is recruiters hitting people up and asking them to apply. If a recruiter reaches out, in my opinion, they should already consider the person in question a genuinely attractive candidate and should be asking for a little more info, not asking them to start at step 0, they can do that on their own if they decide to. ------ bane Like many people I have issues with Google. But let's not be silly about things I'd still love to work there. Not out of any particular fanboyism, but because I've worked at plenty of places that are _not_ Google, and the normal day-to-day in a place like a large defense contractor are categorically worse than even the worst nightmare scenario I've ever heard about working at Google. ~~~ neilk I think almost anything is better than a giant defense contractor. I had a friend who worked at the place she called "the bomb factory" (in a completely innocuous IT role) and her stories were universally depressing. ~~~ nostrademons I have a friend who worked at General Dynamics for about 5 years and really enjoyed it. At least up until all the people he respected and liked working with quit. ------ halayli Interviews should go both ways. If the interviewer asks you for a "pop quiz" algorithm question you studied in college and forgot about it ever since, you should throw a "pop quiz" question back at them. Not for spite, but if they expect you to know the answer to their question you should expect that they know the answers to your question. ~~~ danielweber Depending on tone, I would treat a candidate who does this as either "really awesome" or "really an asshole." YMMV. ~~~ halayli The tone is a major player here. :) But every candidate has the right to know who'll they be working with on a daily basis. ------ clarky07 I was with this post until I got to: "Over time, I’ve come to consider that this situation is irremediable, given our current capitalist system and all its inequalities. To fix it, we’re going to need to work on social justice and rethinking how we live and work and relate to each other." Give me a break. 
Socialism isn't going to fix anything. Have you seen Europe recently? Capitalism and Democracy are the worst solutions for economies and governments, except for all of the others that have been tried. America and capitalism have given more opportunities to help people grow out of poverty into success than anything else in the world. If you think you are entitled to something, you're wrong. Get off your ass and do something productive and go take that thing you think you're entitled to. If you are sitting on your couch watching tv, the only thing you are entitled to is being overweight. ~~~ uriah The US actually lags behind most of western Europe and Canada in social mobility. Especially among the poorest. "European Socialism" is not that different from the US. In Europe you may get free child care, health care, etc. and industries that the government likes get subsidies. In the US, this assistance comes in the form of tax cuts. Tax exemptions for employers who pay for health care, the child tax credit, tax deductions for favored industries, etc. Including tax expenditures, the US spends more on social welfare than most of western Europe while doing a poorer job of ensuring it reaches the right people. ~~~ clarky07 I find much of that to be a huge waste of money. Why are we encouraging people to have kids with a tax credit? There were a lot of people in my hometown on welfare who would have more kids because you get more food stamps and welfare than what the kid would cost them. Brilliant. We are subsidizing terrible parents bringing kids into poverty because these idiots want more food stamps. Just because things are done a certain way doesn't mean it's a good idea. It certainly isn't a capitalist idea to have favored industries with subsidies or tax credits. America is far from being a pure capitalist society. It's a bastardization of capitalism and socialism, and while I may not be in the majority, I'd prefer pure capitalism over what we've become. I'm all for helping people who can't help themselves, but I'd prefer it be done in the private sector. I give more to charity than I pay in taxes, and the money I give does far more good than food stamps and welfare ever will. If I didn't have that money wasted on taxes (note I don't think it all is, just a lot of it. i.e. roads are a good thing, military, etc.), I could give a lot more to the places I think are more worthy. ------ brown9-2 I think it's fairly obvious from all of these stories about the Google recruiting process - being pestered fairly regular by different people who appear unaware or indifferent to your past experiences with Google, either as an employee/contractor/failed candidate - that the recruiters have some sort of quota system and need to keep feeding potential candidates into the interview process to meet their own metrics. Hence why a lot of times it seems like someone is being contacted out of the blue based upon an "online profile" that someone stumbled upon through desperate-seeming research. ~~~ SilvaR Correct! This is the main source of external Google HR issues. When there is a certain stretched goal to hire X number of people, this sort of situation tends to happen. ------ sxp >The very day after I blogged about that, my Google+ account was suspended, for using the name I was almost universally known by. Over the next couple of months, I campaigned tirelessly for Google+ to change its policies, working with the EFF and other advocates. 
My work was covered in Wired, The Atlantic, and a number of other mainstream press outlets. Obviously this was to no avail... FYI, this policy has changed: [https://plus.google.com/u/0/+BradleyHorowitz/posts/SM5RjubbM...](https://plus.google.com/u/0/+BradleyHorowitz/posts/SM5RjubbMmV) ~~~ greenyoda Note that they still want to have your "real" name in addition to the name that you're known by. The article says: "Over the next week, we’ll be adding support for alternate names – be they nicknames, birth names, or names in another script – _alongside your common name_." So someone who is afraid of revealing their true identity still can't use Google+. ------ pnathan I am curious about responses to skud's call for seeking social justice, it doesn't seem to be addressed here. Personally, I don't have a formed opinion on it... ------ tlow I'm totally with the author here, I just wish we could also include some of the other large companies like Facebook. ------ griftah She is building a social network for hobbyist gardeners. Apparently this is what humanity needs more than geek toys like self-driving cars and augmented reality sunglasses. ~~~ kevinh That's a bizarre critique. It's just more likely she would be working on youtube ad code than self-driving cars or augmented reality sunglasses, which most people would consider less constructive than a social network. Additionally, I think many people would consider Facebook more valuable than augmented reality sunglasses. If we're going to only build things that humanity "needs", we're going to needlessly limit our scope and put many people out of jobs. After all, our needs are considerably less than our wants. ~~~ rohern The original article specifically mentions self-driving cars and augmented reality sunglasses as "geek toys", FYI. ------ anonymous717 So here's my experience with Google's recruiting process. I exchanged a few mails with the recruiter, who offered a SWE position. Then two telephone interviews followed (with the recruiter and extensive interview with some developer), then they arranged an on-site interview. Apparently the 2nd tech phone interview wasn't needed. For the on-site interview I got some paper-mail, where the position had been changed to SRE.. So I'm thinking to myself.. right, this is going to be interesting. So I show up on-site, and 5 interviews follow, with a ~1hr lunch break in the middle. The 1st guy who interviewed me was a bit pompous [hey, he had a PhD!], but OK; the 2nd and 3rd guys were extremely arrogant; the 4th guy (my supposed team-leader) seemed to have had a bad day but was otherwise OK; the 5th guy was the ONLY with whom I felt I could build accord and have an engaging conversation. With him, it didn't feel like an "interview"; we were more like two equals talking about an interesting technical topic. The guy who I had lunch with was... interesting. Suffice it to say that I had the impression that he was on the verge of explicitly telling me NOT to take the position I was interviewed for. (As in, crappy job and crappy place to live in.) They tried to impress me with how every employee gets two big screens, a laptop of their own choice, how big systems they're working with, how good the food in cantina was, the fancy office space, etc.. Their attitude was in general as if they were interviewing a teenager whose "wet dream" was to work for google. I never found out what kind of project I would be working on. 
Everybody's attitude during the interview was "you ask, and we'll tell you if it's not confidential". The SRE position was briefly described as "root on google.com". It turned out that I'd also be required to be periodically on-call (since the position went from SWE to SRE underway), and that the people I'd be working with would be the same people who interviewed me. So I got an offer, a contract came in paper-post and I found out that I'd only be having ~15 workdays of paid vacation per year. Incidentally, in the country I was supposed to move to, it was allowed by the law to work NNN hours _unpaid_ overtime per year... Guess whether NNN matched (or maybe was even greater!) than the number of paid vacation days. I didn't take the offer, and it turned out to be a damn good choice. I'm rather sure that somebody without other hobbies or desires to have some free time to spend on things other than computers would have had a different subjective experience. [This post is deliberately vague about some details in order not to reveal too much about the persons involved.] ------ mamoswined The last recruiter I talked to was hilarious. Like the author here, I pulled my Linkedin profile and I feel like I am fairly hard to find. However, I have a food blog and I guess I might have mentioned some tech stuff there. So this recruiter calls me and starts talking to me about food, I guess as a way to get me interested, but he totally and hilariously botches it because it's pretty clear he knows nothing about it. I told him "You live in one of the best food cities in the world, so even if you aren't looking to recruit you should to to these restaurants/food purveyors." I sent him a list of them and told him I wasn't interested in a job. ------ zoidb at least you aren't bitter about it :) ~~~ NegativeK I'd call that more justifiable anger than bitterness. ~~~ eumenides1 I agree. Google should have done their homework before contacting a past employee (including contractors). Google as a large organization should know why people left their company and the context of situations around it. Some employees aren't vocal about it, but I don't think this is the case. ~~~ Kylekramer At risk of being callous, why should they? If an employee already dislikes a company to the point that a recruitment letter angers them, why should the company go out the way to accommodate them? The amount of effort it would take to throughly research every recruitment letter recipent seems like it would be a largely pointless endeavor with minimal returns. Most people aren't going react negatively to an email feeler and the ones that do are already a lost cause. ~~~ bathat >At risk of being callous, why should they? If an employee already dislikes a company to the point that a recruitment letter angers them, why should the company go out the way to accommodate them? So that they don't write blog posts like the one we are commenting on? So they don't actively try to convince other (possibly talented) potential employees to ignore your recruiting efforts? The entire comments section here seems to be "is Google a horrible company to work for and does Google+ suck, or are they merely ok?" If you're looking for new talent, wouldn't you rather have prospective employees asking "is Google a fantastic company to work for or an outstanding company to work for?" ------ xanderhud If you're really, really serious about getting them to stop bothering you, accept an offer from them but then don't show up for work. 
------ interg12 TL;DR: Worked at Metaweb which was acquired by Google to develop for Google+. Left within a year because of a disagreement over policy against pseudonymity and the affect it has on victims of harassment. Received email from Google years later offering job. ------ rrbrambley I've been turned off by Google's recruiting style as well. I don't understand why the current state of tech recruiting is to blast out (almost) generic recruiting emails to everyone. It feels like this style of recruiting should have gone away by now. P.S. I just wrote a blurb about this last week: [http://robdotrob.com/post/33737357324/recruiting-for- bigco-p...](http://robdotrob.com/post/33737357324/recruiting-for-bigco-please- try-harder) ------ redler As a side question, is it common practice for Google to hire people using employment contracts with fixed terms? Is there an underclass of employees who join as a result of an acquisition, but who have expiration dates? ------ welebrity Fascinating article. A truly unique voice and perspective, although after reading all the comments and reviews, maybe their is a groundswell here!? Nonetheless, thanks for opening one brain, and keep on keepin' on! ------ pvdm Megacorps self-destruct eventually. ------ voltagex_ Strangely, the link is blocked by BlueCoat at my workplace. Can someone pastebin the text? ------ Morphling Man, I just wish a big company like Google was interested in me, but since I just mostly suck at everything that will never happen. ------ chris123 Ah, the Hacker News brag. Nice. ------ bborud Oh wow, I have not seen such a self-important wall of text in ages. Pull yourself together. ------ sidcool Hi, can anyone pls post a mirror? I am unable to access this from work. Thanks in advance. ------ mememememememe I agree with you on the unnecessary puzzle solving. I don't even solve Sudoku. How can I even come up with a reasonable solution in 45 minute? The only puzzle I ever solved with C++ and Java was minesweeper and it took me a month in each language. I do think algorithm design problems like fitting 1 million 8 digits into 1 M ram is an OKAY question. Google's scale is big, and they want to test you how quickly you can arrive an OKAY solution in 45 minutes, even if it's wrong. ~~~ Shorel Dude if you can really solve minesweeper, publish the paper and become famous and rich, because that shit is NP-complete. [http://www.cs.montana.edu/courses/spring2004/spring2004/curr...](http://www.cs.montana.edu/courses/spring2004/spring2004/current/513/resources/minesweeper.pdf) ~~~ Yen NP-complete problems aren't impossible, they just get very computationally expensive the bigger they get. OP would only be rich & famous if they had developed an algorithm that solved minesweeper in polynomial or better time. (i.e., a time complexity on the order of O(n^k), where k is some constant) Many NP-Complete problems are trivially solvable, just not trivially solvable in an efficient manner. - like the Travelling Salesman Problem. You can just try every possible combination. It's not fast, but it works. ------ cmccabe Yes, it's probably somewhat frustrating to be laid off by Google and then later receive recruiting emails. But Google is a big company. The right hand doesn't always know what the left is doing. Or maybe the recruiter thinks you really do want to come back to the fold. Who knows. It's pretty unfair to blast Google for "eroding" the public domain, fair use, and consumer rights. 
Google has been a champion of all three of those things, unlike some other companies I could name. Does it really matter if Google+ doesn't support anonymous comments? It's not like there's a shortage of places online to make anonymous remarks. ------ wei2012 Just another dreamer. ~~~ y4m4 affirmative!!
{ "pile_set_name": "HackerNews" }
Pony 0.33.1 - spooneybarger https://www.ponylang.io/blog/2019/12/0.33.1-released/ ====== SeekingMeaning Does Pony still work on Windows? There’s an issue about it[1] from 2015 and the one developer who uses Windows hasn’t committed to the repo since 2016. 1: [https://github.com/ponylang/ponyc/issues/434](https://github.com/ponylang/ponyc/issues/434) ~~~ rurban Sure. After that ticket a Windows build-recipe, CI and deploy was added.
{ "pile_set_name": "HackerNews" }
Generating Fake Conversations by Fine-Tuning OpenAI's GPT on Data from Messenger - Tenoke http://tenoke.github.io/blog/gpt-finetune ====== personjerry Sorry, but these generated conversations seem nonsensical, nothing like the OpenAI results. ~~~ Tenoke It uses their small model and a tiny dataset in comparison (and a small amount of training). It is more showcasing how much it learns (and doesn't learn) with those limitations in place. As well as allowing you to recreate it with perhaps a few minutes of work and less than an hour of waiting. Also, I wouldn't say the results are nonsensical - I think it has learned a lot more than a markov chain or a simple rnn but I agree that especially on the surface they dont even sound like they surpass Eliza by much. Moreover, it is significantly more apparent how much it learns about the different people you've talked to AFTER you run it on your own data. For a somewhat more novel/interesting result with fine-tuning GPT, I can recommend checking out gwern's post[1] on training it on a big poetry corpus. 1\. [https://www.gwern.net/RNN-metadata#finetuning-the- gpt-2-smal...](https://www.gwern.net/RNN-metadata#finetuning-the-gpt-2-small- transformer-for-english-poetry-generation) ~~~ gwern As it happens, nshepperd ran his finetuning GPT-2-small on our IRC channel. I'd tried it before with char-RNN back in 2015 or so, and I have to say, GPT-2-small trained way faster and better than my IRC char-RNN did. The samples also looked a lot better than OP's. I assume that's because he ran it for more like a day on a few hundred MB of chat logs. ~~~ gwern I didn't run on IRC because my GPUs were busy with poetry, FWIW: [https://www.gwern.net/RNN-metadata#finetuning-the- gpt-2-smal...](https://www.gwern.net/RNN-metadata#finetuning-the-gpt-2-small- transformer-for-english-poetry-generation) [https://slatestarcodex.com/2019/03/14/gwerns-ai-generated- po...](https://slatestarcodex.com/2019/03/14/gwerns-ai-generated-poetry/) [http://sevensecularsermons.org/on-the-significance-of- gwerns...](http://sevensecularsermons.org/on-the-significance-of-gwerns-poem- generator/)
{ "pile_set_name": "HackerNews" }
Cryptocats: The automated cat photo tool - aestetix https://www.cryptocats.me/ ====== dclaw This is great. It's the key to my server's life.
{ "pile_set_name": "HackerNews" }
OpenNews: Why Develop in the Newsroom? - knowtheory http://sinker.tumblr.com/post/27833803775/opennews-why-develop-in-the-newsroom ====== odacrem This is great stuff!!! It truly is an exciting time to be working in news/media. It is absolutely the case that mixing, mashing, hacking and building are a means-of-expression as vital to news/media as words and images. A key ingredient in our sauce over here at Vox Media (SB Nation, The Verge and soon Polygon) is a break with the old pattern where technology and development teams are considered (at the organization and conceptual level) to be utilities or tool providers for editorial and business initiatives. Rather, we embrace what internet native reporting and story telling require: an organization and mindset where editorial, business initiatives and technology are intertwined together like some kind of triple helix. This sounds like business speak but what it really is a tough challenge - one that is very exciting and one that is creating some pretty great opportunities for creative people and builders. The Knight-Mozilla OpenNews initiative looks like a great a way to get involved and I very much look forward to the ideas and things this fellowship program produces. Cheers! ------ dansinker These videos were created in part to help spread the word about the Knight- Mozilla Fellowships, which place developers, hackers, and tech-minded folks into newsrooms around the world to do open-source development, travel the world hacking the news, and much more. Tons of info here: mozillaopennews.org/fellowships/ The application closes August 11. ------ knowtheory Particularly with all of the wailing and gnashing of teeth over the supposed demise of news, it's worth hearing from folks who actually _work_ in these environments. More importantly, it's worth noting that they're doing some amazing work there. ------ 3amOpsGuy I think i've missed the point. How is this any different from sitting RAD (as in "rapid application development") guys in your business teams? Lots of companies do this already across many sectors. EDIT: It's still cool though. ~~~ rabidsnail You kill the project (or at least put it in maintenance mode) even if it gets traction. ------ rabidsnail What's the difference between news and information? Is it just freshness? Is the story format really the best way to convey the state of the world to the individual person on the other end of the wire, or is it just the best format for broadcast? My main frustration with The News is the story format. Not only do I have to spend the mental energy of reading the story, but I have to spend _more_ mental energy unraveling the narrative into its primary sources. Often that's actually impossible to do and I end up giving up and not reading the rest of the story. ~~~ dansinker I'd say that the difference between "news" and "information" isn't freshness, but relevancy. There's a lot of information out there that doesn't have relevancy to a user. When it does, that information transitions from simply being "information" into "news." In terms of story, we are as a culture oriented toward story and narrative. I think that's been true for forever. Cave paintings are remnants of ancient narratives (and of the point that information (there are buffaloes) becomes news (there are buffaloes near us, so we killed them) in the same way that newspapers reflect the remnants of more recent narratives and how all sorts of information delivery vehicles now contain narratives today. 
The key thing about The News today though is that it's still pretty reflective of old forms, and that's probably where your frustration with the story format comes in. Those old forms actually made sense when they were created, in part because they reflected the needs of the delivery systems. Of course now the needs of, say, telegraphs during the civil war (which is where the "reverse pyramid" system was created) is a silly way to make your news. Which is why what some newsrooms and devs are doing to blow up the story, to start thinking about how to put information across in ways that are exciting and new and reflective both of medium but also representational (and respectful) of the information itself is pretty damn kick-ass. I'm thinking of something like this: [http://www.nytimes.com/interactive/2012/06/11/sports/basketb...](http://www.nytimes.com/interactive/2012/06/11/sports/basketball/nba- shot-analysis.html) There's actually a lot of narrative there, but it's narrative in a very different way than we'd normally think about it. There's a ton of information there as well, but it's been touched in really thoughtful ways. Things like this, which are happening more and more every day, make me super excited. PS. WOW that was a long answer. ~~~ rabidsnail I think we're saying basically the same thing, but you're more patient than I am (and possibly more willing to compromise). ~~~ dansinker Haha. You are probably correct on both fronts. ------ jessevondoom I know Dan and a few of the others involved in this project and their passion is both amazing and warranted. Data is becoming and indispensable tool in the modern newsroom, and it makes sense that the pairing of seasoned journalists and creative developers is key to uncovering new stories today. Pretty exciting stuff, and I can't wait to see what they do next. ------ rabidsnail David Nolen is really involved in the Clojure community (core.logic, core.match, clojurescript). I had no idea he worked at the Times. ~~~ dansinker The talent the Times has managed to build up on its dev team is pretty unbelievable. ------ danso About 5 - 10 years ago, the status quo was to have developers be part of the business side of news orgs, that is, controlled by the ad people...not necessarily a bad thing, but a situation wholly different than what is being discussed in the OP, where hackers are helping to advance the news product and reportage. I'm having trouble thinking of other professions in which developers work inside the "prized" group of the organization -- in this case, editorial -- rather than as contractors or part of the IT group. Maybe programmers in the certain sectors of the financial industry? ~~~ dansinker I think you're right: There's a real shift now in newsrooms as they realize the value of having developers involved in the editorial process and the power in creating web-native news experiences.
{ "pile_set_name": "HackerNews" }
The Butler Didn’t Do It: Hello Alfred and the On-Demand Economy’s Limits - pavornyoh http://www.bloomberg.com/news/articles/2016-01-21/the-butler-didn-t-do-it-hello-alfred-and-the-on-demand-economy-s-limits ====== cryoshon Am I the only one who is extremely uncomfortable with wasting so much human potential to generate frills for people who already have enough frills? It isn't just getting people fresh out of college to be my butler, I'm uncomfortable with Uber/Lyft too: I've had multiple cabbies with Masters degrees, and even an admitted literature PhD. These people could be building us a new Great Society where learning and creativity are cornerstones, but instead, we look down on such people and in fact despise them for their lack of premeditated profit-seeking behavior during education. They are stuck driving cabs or butlering for someone who got a few lucky breaks instead of making use of their finer abilities, and it's frustrating to see them flounder in the precariat. ~~~ swalsh I view apps like this as a transition helper, eventually automated cars will replace uber drivers, etc. However until that tech is fully ready people have to do the jobs. These apps are like creating the interface before the back-end is ready. The real interesting financial possibilities in my opinion about the "gig economy" is the long term potential for when it's not people who are doing the "gigs". Of course people should be a lot more friendly to the people who unfortunately have to rely on these "gigs" to get by. ~~~ VLM So the TLDR is massive underemployment of some of our most expensively educated youth isn't a problem because soon they'll be unemployed? ~~~ swalsh One would hope that in a world where automated cars and advanced AI make unskilled employment especially difficult, unemployment wouldn't be a terrible thing. ~~~ vkou That would require a _complete_ overhaul of the social order... Unlike AI, however, nobody seems to be making any headway on that. ------ forinti This seems to me like the US becoming a lot like Brazil, where income inequality allows the middle class to have maids. Except, of course, americans would do it more efficiently and with a positive attitude. ~~~ crpatino I am almost afraid to ask, but what do you mean by "more efficiently" and "with a positive attitude"? I confess what I imagine out of those frases is roughly: "for less money/with no strings attached" and "in blissful ignorance of the economic violence that entails". ~~~ personlurking I don't pretend to know the commentor's intentions but, for reference, there's something in Brazil called the Stray dog/Mongrel complex which was thought up by a famous Brazilian writer, and which may explain it for you. Of his phrase, this writer said it's “the inferiority with which the Brazilian positions himself, voluntarily, in front of the rest of the world”. From a NYT piece written by author Larry Rohter in 2004: "Writing in the 1950's, the playwright Nelson Rodrigues saw his countrymen as afflicted with a sense of inferiority, and he coined a phrase that Brazilians now use to describe it: "the mongrel complex." Brazil has always aspired to be taken seriously as a world power by the heavyweights, and so it pains Brazilians that world leaders could confuse their country with Bolivia, as Ronald Reagan once did, or dismiss a nation so large -- it has 180 million people -- as "not a serious country," as Charles de Gaulle did." ------ Disruptive_Dave I wonder what the role of "errands" plays in the development of a person. 
These types of responsibilities must be beneficial to our every day existence, right? And if you can outsource all these alleged time wasters, what would you replace that time with? The final paragraph of the piece tells the real story, IMO. > Because, if I’m being honest, the real reason I never attend to all of these > matters isn’t because I’m too busy. It’s because on Saturdays, when my dry > cleaner is open, I’d rather sleep late and go to brunch. ~~~ sneak Now try it while running a relationship, a family member with poor health, a full-time consulting business, _and_ a startup. Lots of people can benefit greatly from an extra hour or two per week. ~~~ ghaff It's absolutely fair that it's worth it for some people to pay to free up a few hours per week. However, it's equally true that it's not necessarily worth it for others, especially if the outsourcing becomes as much a headache as doing it yourself. ------ Pxtl I know how this story ends: the app has basically arranged to introduce a service-provider to a client, and unlike Uber where you're getting services at different locations where different providers are needed every time, Alfred is providing the same service over and over again. This makes it trivial for the provider and the client to eliminate the middle-man. Service agencies of all kinds have been struggling with this problem since time immemorial, putting them on an app solves nothing. I'd actually rather a "pay for recommendation" engine where I can pay the app to just put me in touch with a reputable service-provider for whatever - like HVAC or an electrician or a local housekeeper. ~~~ sremani There is an app for that, its called Angie's list. ~~~ Pxtl _googles_ Holy crap that's an ugly website. Also, doesn't realize it's US only because it lists my Canadian city like everything's kosher but only accepts a US zipcode. ------ ghaff There are so many tasks that, by the time I figure out what needs to be done and make decisions, I'm at least half-way there. I do have a housekeeper who comes by occasionally but, honestly, sorting my mail isn't high on my list of life's issues and, even if it were, you're talking private secretary not on- demand gig. If I look at my list of tasks on my whiteboard, there's nothing there I can just tell someone to "make it so" and it would be done--or I would have already done so. (Or it's trivial stuff that I'll take care of when I can batch it with some other things.) ~~~ Justsignedup Most of life's small tasks require knowledge and involvement, otherwise they'd be automated already. Except spam mail. That is intentionally annoying. Fuck you US Postal Service (okay fine, this is their main source of income) ~~~ ghaff I tend to forget that I used to have to sit down, write out a bunch of checks, and post them in envelopes on a regular basis. Or the amount of telephone tag I used to have to play on a regular basis. ------ somberi I fit some of the characteristics mentioned the article (NYC, Income, etc) and I honestly wonder why single people need this much help. Manhattan, unlike most of USA is densely packed with grocery stores, laundry and most of them are open close to 14 hours at least. In addition, there are umpteen on-demand grocery suppliers (like Freshdirect, Amazon, Google, etc). Most apartment buildings have laundry machines to wash your clothes (not all of them do of course). I cook most of my meals, wash my clothes and still have a good quality of life (in addition to working hard). 
For some reason this cartoon comes to mind - [http://www.newyorker.com/wp-content/uploads/2015/10/KanekoMo...](http://www.newyorker.com/wp-content/uploads/2015/10/KanekoMoulyCoverStory-IvanBrunettisComfortFood_1-690-940-23151024.jpg) ------ VLM I looked up her servant's background... It takes four years at $65900/yr to get a degree to be a servant, where the average career is eight months long. Ouch. That indicates some severe structural economic problems.
{ "pile_set_name": "HackerNews" }
The teens who build the iPhones - cwan http://tech.fortune.cnn.com/2010/12/24/the-kids-who-build-the-iphones/ ====== ErrantX So, checking the video; looks like some reasonable improvements. And yet, CNN are spinning the story as "OMG, teens make your iPhone". If I recall correctly, I spent all summer whilst 17, 18 and 19 working shifts to earn money for uni/college. There is a determined effort here to paint conditions as terrible, third world and oppressive. I'm not so sure it's accurate. As I commented on the original story (when it first did the rounds) there is a huge cultural difference between here and there, so we must judge carefully in these matters. For example; the long hours are, sure, pretty sucky. But culturally that is how a lot of Chinese people work; long and hard. Those on my course at university were usually the last to leave the department at night, for example (3 or 4am sometimes). That's not to say anyone would _like_ to work this job and want long hours; but it's a misunderstanding to see that as immediately abusive. Even worse; China has a huge population problem, one we can't really imagine facing. What exactly are these people supposed to do to earn money? Yes, there is ample opportunity to abuse such a problem. And, yes, I am sure it does happen. This sort of reporting is essential in making sure it _doesn't_ happen wherever we can stop it. But it would be nice to see a little more common sense (i.e. rationality) applied to the _way_ it is reported. Of course, such an approach doesn't "sell papers" :) The real problem here is the huge socio-economic problem China is facing. Not buying Chinese goods might even be counter-productive. If grass roots action closes a Chinese factory, what is the impact, and who feels it? This is not a simple problem solved with outrage, but with calm forceful communication with the manufacturer. It seems to have somewhat worked in this case, going by the improvements. ~~~ glenra Factory work serves as an entry-level transitional job there, much like working for McDonald's here. The workers tend to be in the 16-20 age range; they do assembly line work for a few years, sending a little money home to help the family and then they move to some other company that pays more for more experience. Or get married, or who-knows-what. I'm sure I missed some of the nuance in the french narration but looking at the video I didn't see anything unusual or upsetting there. What, exactly, are we supposed to be upset about? Yeah, the hours are long, but electronic assembly-line work is a pretty cushy job by local standards. Beats heck out of working in most of those storefronts. Better lighting, better air conditioning, better pay, and you get to sit down while you work. ~~~ ErrantX Nope. I think you are dead one; and probably put the point I was trying to make even better :) ------ nnutter Video is private and the text is just a synopsis of a linked article. If you actually follow to the article you will find an editorial not journalism. It uses words to imply communism and describes an awful day such as getting up at 6am, leaving for work by 6:40, eating on the go, and working long hours for overtime. All while repeatedly says that the conditions are better than elsewhere in China. Yet the title is "Why I don't want an iPhone for Christmas" and not "Why I won't buy 'Made in China' products." 
~~~ glenra The conditions on an electronics assembly line tend to be _amazingly_ better than elsewhere in China for a very simple reason: the _product_ demands it. When you are assembling electronics, the space in which you do it needs to be temperature controlled, humidity controlled, dust-free, well-lit, with the right tools available for each task at hand and a stable source of electricity (including backup generators) making sure this continues to be the case. Also, the workers shouldn't be too tired. Violate any of those conditions by much and the assembled product _doesn't work_ reliably, fails its acceptance tests, causing the company to lose business to somebody more meticulous. You might be thinking, "Okay, so there's decent lighting and air conditioning and dust control and power - big deal!" If you are thinking that, you have obviously never been to Dongguan in August. :-) ------ code_duck Do these articles focus on Apple because it is perceived as a luxury brand? What are the conditions like for workers who build HTC, Samsung or Motorola phones? How about every other product in China? I'd love to see 'the people who assemble the portable heater you bought at WalMart' or 'these are the Chinese prison slaves who make your gloves for Target'. ------ coryl Found a mirror for the video: [http://www.lavie.fr/actualite/monde/dans-l- enfer-du-high-tec...](http://www.lavie.fr/actualite/monde/dans-l-enfer-du- high-tech-chinois-23-12-2010-12724_5.php) ------ modeless I suppose these people would rather have us stop buying iPhones, have Foxconn go out of business, and have all these teens go back to their poverty-stricken villages for a life of subsistence farming? After all, capitalist factory owners are evil, while living off the land is virtuous and pure! Or something. ------ ez77 The West in general, and the US in particular, will soon reach an unsustainable point where simply too much stuff is made elsewhere. Naturally, just as for decades media (and hence people) looked the other way, the abusive labor conditions that made Walmart and our $400 computers possible will now become front and center of more and more news stories. This much, to me, is pretty certain. The consequences of clashing over this with China, however, are very much a mystery. ------ sjs Video is private, not really much to see here without it. ------ CBizzle Apple and HP can't demand that these workers work shorter shifts and have two days off per week?! WTF?!!
{ "pile_set_name": "HackerNews" }
Simulation of Ideal Gas Particles - proee http://www.eeweb.com/project/steven_hochstadt/simulation-of-ideal-gas-particles ====== dalke "To keep track of just 500 balls, we have to make (500 × 501/2 comparisons) × (60 frames per second) = 7,515,000 comparisons per second." This is incorrect. Space partitioning makes the distance search much faster. Ideal gas simulations started in the 1950s. One of the standard optimizations is to compute time until next collision and use a priority queue. Pop the smallest item off the queue and advance everything by that step, compute the collision, then do the 500 or so line intersection tests to figure out when the next collision will occur, and put that back into the queue.
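Not from the thread: a minimal Python sketch of the event-driven scheme dalke describes, reduced to equal-mass point particles on a line (sorted by position) so the collision math stays trivial. A real 2-D hard-disc version would solve a quadratic per pair for the collision time and add a spatial partition on top; every name below is made up for illustration.

    import heapq

    def collision_time(x1, v1, x2, v2):
        # Time until two point particles on a line meet (assumes x1 < x2),
        # or None if they are not approaching each other.
        dx, dv = x2 - x1, v2 - v1
        return None if dv >= 0 else -dx / dv

    def simulate(positions, velocities, t_end):
        # Event-driven simulation: rather than testing every pair every frame,
        # keep a priority queue of predicted collisions and jump between events.
        # Equal masses on a line simply swap velocities when they collide.
        # `positions` must be sorted in increasing order.
        n = len(positions)
        counts = [0] * n           # per-particle collision counters, to spot stale events
        events = []                # heap of (time, i, j, count_i, count_j)
        t = 0.0

        def schedule(i, j, now):
            dt = collision_time(positions[i], velocities[i],
                                positions[j], velocities[j])
            if dt is not None:
                heapq.heappush(events, (now + dt, i, j, counts[i], counts[j]))

        for i in range(n - 1):     # on a line only neighbours can collide
            schedule(i, i + 1, t)

        while events:
            te, i, j, ci, cj = heapq.heappop(events)
            if te > t_end:
                break
            if ci != counts[i] or cj != counts[j]:
                continue           # stale prediction: one of the particles collided since
            for k in range(n):     # advance every particle to the event time
                positions[k] += velocities[k] * (te - t)
            t = te
            velocities[i], velocities[j] = velocities[j], velocities[i]
            counts[i] += 1
            counts[j] += 1
            for a, b in ((i - 1, i), (i, j), (j, j + 1)):
                if 0 <= a < b < n:
                    schedule(a, b, t)

        for k in range(n):         # drift to the requested end time
            positions[k] += velocities[k] * (t_end - t)
        return positions, velocities

    # e.g. simulate([0.0, 1.0, 4.0], [1.0, 0.0, -2.0], 5.0)

The point of the counters is that a previously predicted collision becomes invalid once either particle has collided with someone else; checking them on pop is cheaper than deleting entries from the heap.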
{ "pile_set_name": "HackerNews" }
Show HN: JobRudder – Mission control for your career - gentleteblor https://jobrudder.com ====== gentleteblor Hi all, I built this thing. Please check it out, I'd love to hear what you think or answer any questions you have. ~~~ CatsoCatsoCatso Are you able to provide screenshots? I'm reading this page but I can't get my head round what this actually does. ~~~ gentleteblor Thanks for checking it out. Screenshots are a good idea. I'll try to get those added to the home page soon. Until then, here are a few I just put on imgur. [http://imgur.com/X2z6bP9](http://imgur.com/X2z6bP9) [http://imgur.com/nLFtn3Y](http://imgur.com/nLFtn3Y) In general, JobRudder works like this: In: You add in what you're doing at work, your deeds and accomplishments. Out: We generate performance reviews and/or resumes for you in one click. Plus analytics, search, tags etc.
{ "pile_set_name": "HackerNews" }
2013 - The Year of Wearable Computing - BinaryAcid http://www.singularityhacker.com/post/44662876695/2013-the-year-of-wearable-computing ====== simonh If Steve were around he'd be telling us all how nobody wants to wear a computer.
{ "pile_set_name": "HackerNews" }
Haskell Skyline - pykello http://pykello.github.io/programming/2015/12/12/haskell-skyline/ ====== Chris_Newton Perhaps I’ve misunderstood the problem, but the second solution shown here doesn’t seem to be correct: it always adds a trailing 0-height entry to the skyline, yet there could be an existing building touching or overlapping the newly added building, whose height is then lost. For example, the code given appears to make skyline [(1, 2, 3), (2, 1, 4)] evaluate to [(1,2),(3,0),(4,0)] when if I’ve understood correctly it should be [(1,2),(3,1),(4,0)] ~~~ pykello You are right. Thank you. Going to correct this. ------ jzwinck I was given this exact problem at a Google interview. In fact it was virtually the only high level, real code programming exercise I was asked to do in many hours of interviews. ~~~ ilurkedhere I too got this problem, though not at google. White-boarded it and came up with the n^2 solution. ------ bschwindHN For those of us not particularly skilled at Haskell, could you add some comments on the actual code, or at least explain some of the lines? I get the English part of the algorithm, but I don't think I'm up to speed on some of the syntax being used in your solutions. ~~~ pykello "foldl" works like aggregation in databases. When you say "fold func init values", the result is calculated as: result = init for value in values: result = func(result, value) return result So, "foldl add_endpoints [] bs" will translate to: result = [] for (x1, h, x2) in bs: result = add_endpoints(result, (x1, h, x2)) return result If you expand the 3rd line, you get: result = [] for (x1, h, x2) in bs: result = result ++ [(x1, height x1), (x2, height x2)] return result where "++" is the list concatenation, and "height x" is a function which finds the skyline height by finding the tallest building with "x1 <= x && x < x2". I think the 2nd solution should be easy to understand if you understand the 1st solution. Probably the only new thing is that I've sorted the buildings by height before passing them to "foldl". In the 3rd solution, the following lines: skyline bs = (skyline (take n bs), 0) `merge` (skyline (drop n bs), 0) where n = (length bs) `div` 2 mean: first_half = first_n_elements(bs, length(bs) / 2) second_half = remove_first_n_elements(bs, length(bs) / 2) result = merge ((skyline(first_half), 0), (skyline(second_half), 0)) You may wonder what are those 0s? In the merge function I need to keep track of current height of each half, and the initial height of left and right skylines are 0. Then, in merge function: merge ([], _) (ys, _) = ys merge (xs, _) ([], _) = xs means return the other list if any of the lists become empty. The underscores mean a variable whose value is not important for us. We don't care about the current height values here, so I've put _'s instead of real names. Then the other cases: merge ((x, xh):xs, xh_p) ((y, yh):ys, yh_p) | x > y = merge ((y, yh):ys, yh_p) ((x, xh):xs, xh_p) | x == y = (x, max xh yh) : merge (xs, xh) (ys, yh) | max xh_p yh_p /= max xh yh_p = (x, max xh yh_p) : merge (xs, xh) ((y, yh):ys, yh_p) | otherwise = merge (xs, xh) ((y, yh):ys, yh_p) First case, "x > y" simply swaps the two args. This ensures that in the following cases we have x <= y. Second case is probably easy to understand. In third case, we know that x < y. Just before reaching x, skyline has height "max xh_p yh_p". When we reach x, height of skyline changes to "max xh yh_p". If these values are not equal, we have a height change. 
So we construct a new list with head "(x, new height)" and the result of merging the rest of skylines. If the height doesn't change, we just ignore the change at x and continue with the rest of skylines. ~~~ iheartmemcache Upvote for a really accessible explanation. I wish I had posts like these to read when I was learning Haskell half a decade ago. You've got a knack for conveying concepts! Write some texts or at least a blog! ------ eru I wrote a really nice merge function (in Haskell) when I helped a friend prepare for an interview. Perhaps I should send it to the author. ------ heinrich5991 I believe integer sorting doesn't have a lower bound of O(n log n), because you can do more with them than just comparing. ~~~ enedil Of course. But nowadays, space complexity is more and more important, which makes every sorting algorithm that is > O(n) not sufficiently efficient.
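Not from the thread: a rough Python transcription of the same divide-and-conquer skyline, for readers who (like bschwindHN above) wanted the algorithm without the Haskell syntax. Buildings are (x1, h, x2) triples and the output is a list of (x, height) key points, as in the article; treat it as an illustration, not the author's code.

    def skyline(buildings):
        # O(n log n): split the buildings, solve each half, merge the two half-skylines.
        if not buildings:
            return []
        if len(buildings) == 1:
            x1, h, x2 = buildings[0]
            return [(x1, h), (x2, 0)]
        mid = len(buildings) // 2
        return merge(skyline(buildings[:mid]), skyline(buildings[mid:]))

    def merge(left, right):
        result = []
        i = j = 0
        lh = rh = 0                       # current height contributed by each half
        while i < len(left) and j < len(right):
            if left[i][0] < right[j][0]:
                x, lh = left[i]
                i += 1
            elif right[j][0] < left[i][0]:
                x, rh = right[j]
                j += 1
            else:                         # key points at the same x: consume both
                x, lh = left[i]
                rh = right[j][1]
                i += 1
                j += 1
            h = max(lh, rh)
            if not result or result[-1][1] != h:   # keep only real height changes
                result.append((x, h))
        result.extend(left[i:])           # the exhausted side contributes height 0 here
        result.extend(right[j:])
        return result

    # skyline([(1, 2, 3), (2, 1, 4)])  ->  [(1, 2), (3, 1), (4, 0)]

The last line is Chris_Newton's example from the top of the thread, and it produces the corrected output [(1, 2), (3, 1), (4, 0)].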
{ "pile_set_name": "HackerNews" }
Trump to explore entering Pacific trade pact he once called 'a disaster' - neaden http://thehill.com/homenews/administration/382867-trump-orders-officials-to-look-into-re-entering-tpp-trade-pact ====== msie And I'm sure the other countries will welcome the US with open arms. Sadly, Trump will get his way and there will be no consequences for bowing out of the trade pact earlier. ~~~ igravious From Wikipedia: “The Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP), also known as TPP11[2][3][4] is an agreed in principle trade agreement between Australia, Brunei, Canada, Chile, Japan, Malaysia, Mexico, New Zealand, Peru, Singapore and Vietnam. The CPTPP incorporates most of the Trans-Pacific Partnership (TPP) provisions by reference, but suspended 22 provisions the United States favored that other countries opposed, and lowered the threshold for enactment so the participation of the U.S. is not required.[5] The TPP was signed on 4 February 2016, but never entered into force as a result of the withdrawal of the United States.[6] All original TPP signatories, except the U.S., agreed in May 2017 to revive it[7][8] and reached agreement in January 2018 to conclude the Comprehensive and Progressive Agreement for Trans-Pacific Partnership. The formal signing ceremony was held on March 8, 2018 in Santiago, Chile.” TPP11 is _less favourable_ than TPP for the US because 22 provisions which favoured the US were removed! Not only that, by abandoning TPP the US actually made it possible for the remaining 11 nations to drop the provisions that they felt favoured the US (perhaps because they were strong-armed into accepting them?). “On January 25, 2018, U.S. President Donald Trump in an interview announced his interest in possibly rejoining the TPP if it were a "substantially better deal" for the United States.” Given that TPP no longer exists, how is the US to join? Given that TPP11 is explicitly worse for the US than TPP (because the US abandoned it) how does the US think it is going from negative back to status quo and from there to positive territory? Presumably it must convince 11 nations, some of them sizeable, to renegotiate a deal that would be to their collective detriment. In word _bonkers_. Less inflammatory _out of touch with reality_. In other news. “In January 2018 the United Kingdom government stated it is exploring becoming a member of the Trans-Pacific Partnership to stimulate exports after Brexit in March 2019 and has held informal discussions with the members.” (T1) "I know, let's leave the world's largest common market on our doorstep!" (T2) "I know, let's join this other trade area half way around the world!"
{ "pile_set_name": "HackerNews" }
Ask HN: How do I make sure my website is GDPR compliant? - jbuttwerworth Hey folks, I have a side project (a web app) which requires login via Facebook and Google to work. I intend to release it publicly but before that I want to make sure I'm GDPR compliant. The web app stores minimal info for the user such as the email (encrypted) and their first name (the data is provided from the social networks I mentioned above). I looked online for help on how to make sure a web app is GDPR compliant but it's confusing. Is there someone here with actual experience on this who can provide some guidance? Is there an official guide in layman's terms on how to do that? Thanks ====== Nextgrid Sounds like you're already compliant. Storing metadata about a registered user is perfectly acceptable under the GDPR for functional & legitimate interest purposes. I would recommend adding a way for a user to delete their account, unless the third-party login provider gives you webhooks on when OAuth consent is revoked, in which case you can use that as the signal to delete all PII stored locally. ------ runningmike GDPR compliance does not exist and is a long-living fad sold by consultancy companies. To make sure you align with GDPR regulations: just do not store personal data of customers. Never. Most important things to know: [https://nocomplexity.com/gdpr-principles/](https://nocomplexity.com/gdpr-principles/)
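Nextgrid's suggestion above (a self-service account-deletion path, or deleting on an OAuth-revocation webhook) is the main moving part here. Below is a minimal sketch of such a deletion endpoint, assuming a hypothetical Flask app with a sqlite3 users table; the names and schema are illustrative, not taken from the original post.

    # Minimal "delete my account" endpoint; the same helper can be reused by a
    # webhook handler if the login provider signals that consent was revoked.
    import sqlite3
    from flask import Flask, jsonify, session

    app = Flask(__name__)
    app.secret_key = "replace-with-a-real-secret"

    def delete_user_record(user_id):
        # Remove every piece of personally identifiable information we hold.
        with sqlite3.connect("app.db") as db:
            db.execute("DELETE FROM users WHERE id = ?", (user_id,))

    @app.route("/account", methods=["DELETE"])
    def delete_account():
        user_id = session.get("user_id")
        if user_id is None:
            return jsonify(error="not logged in"), 401
        delete_user_record(user_id)
        session.clear()
        return jsonify(status="account deleted"), 200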
{ "pile_set_name": "HackerNews" }
'NSA should come clean about domestic spying': NYC Police Commissioner - discostrings http://www.nypost.com/p/news/local/nsa_should_come_clean_ray_kelly_dfAKlqJ4keYDNiJqANhIMO ====== dmix Had to look this up: > The NYPD secretly spied on Muslim organizations, infiltrated Muslim student > group and videotaped mosque-goers in New Jersey for years, it was revealed > in 2012. The NYPD said its actions were lawful and necessary to keep the > city safe. [http://gothamist.com/2013/05/20/ap_obtains_texts_from_the_ny...](http://gothamist.com/2013/05/20/ap_obtains_texts_from_the_nypds_mus.php) [http://gothamist.com/2013/04/30/bloomberg_dont_mess_with_nyp...](http://gothamist.com/2013/04/30/bloomberg_dont_mess_with_nypd_becau.php) Look's like they should focus on their own problems. ~~~ daniel-cussen To be fair, in a way they _were_ focussing on their own problems. Like...the whole Twin Towers thing. ~~~ dmix The NYPD declined the need an Inspector General or a federal monitor after it was discovered. They are conducting a secret surveillance based on a groups religious affiliation without public scrutiny and want to defend the power to continue doing so. ------ tsaoutourpants Pot, kettle, black. ------ Glyptodon "I think the American public can accept the fact if you tell them that every time you pick up the phone it’s going to be recorded and it goes to the government." I can't believe this guy. At least he thinks it should be public, I suppose. ~~~ rhizome Because he wants to do it, too. In a just world this guy would be out of a job. ------ rl3 “I think the American public can accept the fact if you tell them that every time you pick up the phone it’s going to be recorded and it goes to the government,” Kelly said. “I think the public can understand that. I see no reason why that program was placed in the secret category.” The whole media ruckus over Snowden is simply because people don't like secrets being kept from them. They're actually completely cool with the whole total surveillance thing, though. ~~~ discostrings That's actually part of the media ruckus. There's a lot wrong going on with these programs, and people have different objections. Some object to the surveillance itself, some object that it was kept secret but approve of the surveillance, and some object to the way the executive branch is doing this without meaningful input from the other branches of government. And of course, some simply object because of the executive who is doing it. For those who object to the surveillance: it's really important to engage others about the privacy/surveillance debate right now. As those who only oppose the secrecy or the process aspects of what's going on become satisfied, interest will dissipate and it'll become much harder to engage people on the surveillance issue itself. It's a really interesting opportunity to actually engage people about privacy and surveillance without many people instantly deferring to the status quo. ~~~ rhizome How about we just start at whether it's legal and Constitutional? Everything else follows from that. ------ betterunix s/come clean about/let us use their system for/ ------ woah “He tried to give the impression, it seems to me, that these system administrators had carte blanche to do what they wanted to do,” he said. “I think it’s a problem if that’s in fact what’s happening.” Is it, in fact, possible to administer a system without having full access to that system? ~~~ bilbo0s Well... Yes and no. For instance, you can give a team full access... 
and at the same time have access control only grant that full access when two different people enter their credentials. Also, audits should be more than cursory... and should be performed by a DIFFERENT organization. They can operate in a more secure fashion than, I suspect, they currently operate. But when you consider the full matter.. You begin to see that the message, on a national level, is being managed in such a manner that "solutions" presuppose the EXISTENCE of these databases. If it were up to me, I would try to keep the conversation squarely centered around the deconstruction of this surveillance infrastructure. But we need to find a Senator who can credibly do that in the absence of Feingold... and this is a problem. ------ forgotAgain Just goes to show that regardless of all the claims of oversight, it all comes down to people. You combine that with the fact that power corrupts and it's obvious what a danger the NSA is to America.
{ "pile_set_name": "HackerNews" }
Manifesto for Responsible Software Development - jraedisch http://manifesto.responsiblesoftware.org/ ====== javaguy98 ACM and IEEE collaborated and agreed on a Software Engineering Code of Ethics decades ago [http://www.acm.org/about/se-code](http://www.acm.org/about/se-code) Can we not reinvent this particular wheel please? Codes of Ethics aren't JavaScript frameworks. ~~~ GrumpyYoungMan The National Society for Professional Engineers, which, among other things, is the body that runs the exams for licensing professional engineers in the United States, has a similar code of ethics: [https://www.nspe.org/resources/ethics/code- ethics](https://www.nspe.org/resources/ethics/code-ethics) ~~~ clifanatic And a professional licensing infrastructure (hint hint) ------ bluejekyll > I will not develop software that is intended to violate human rights and > civil liberties. This one doesn't work especially in regards to cryptography. For example, if I develop the greatest end-to-end cryptography, I can almost guarantee this will be used by both good and bad actors. There's nothing I can do about that. IMO, a manifesto like this should really be much closer to the hippocratic oath; which is to say that my responsibility belongs with the patient, disregarding personal feelings toward the patient. So for us, that should be to write code that does the best job that it can; b/c at the end of the day we can't control who uses it. ~~~ DanHulton I would argue that cryptography wouldn't violate that aspect. The intent isn't to violate human rights or civil liberties, even though it can certainly be used that way. So too can a text editor or a printer driver or scheduling software. I think you're being a bit broad. To me, it reads fairly clearly as a stand against code whose specific intent is unethical, not it's potential and unrelated uses. ~~~ bluejekyll Yeah, I get the intent argument, but I don't think it has a good separation of concerns, for lack of a better term. It's written in such a way as to state that if I know it will be used for bad purposes, then I should not write the software. And as you stated, even a text-editor could be interpreted that way. I could even turn this around in a different way, let's assume I decide that I want to help catch bad people. I do this by writing some software that helps deanonymize connections. This can be used to help stop DDOS attacks, track down sex traffickers, etc. So it's intent is good, but it will also be used by Iran, Syria, Russia, China, USA et al. to track down dissidents. I believe this software should be unethical by these standards, b/c of it's potential misuse/abuse, it was written with good intent! So is that unethical or not? The reason that I dislike that rule, is that it places the use/abuse on the developer, it's really the operator of the software that is at fault. Right? ~~~ Bartweiss > The reason that I dislike that rule, is that it places the use/abuse on the > developer, it's really the operator of the software that is at fault. Right? I think that in practice I disagree with this. Certainly final moral responsibility lies with whoever misuses a tool, but on a practical level people should be aware of the primary or predictable results of their work. If you create a cryptography system and release it publicly, you should be aware that it will be used by people hiding unethical things. If you build a deanonymization tool, you should be aware that it will be used for surveillance by people with ill intent. 
These are statistical certainties - if your tool is good, it will be used in these ways, and you can't claim surprise when it happens. It's like the stochastic terrorism question, where you can't know what an individual will do but you can easily predict that a system will produce violence somewhere. None of this means you shouldn't do those things. Inventing TNT wasn't evil just because it's been used for violence, and when it comes to something like cryptography there's a real case for "this will be built eventually, so we have to live with it". With tracking, I sometimes feel the moral question is greyer, especially since the results are system-dependent and not 'inevitable'. So yes, build these things, and accept that they'll be used for all sorts of purposes. But do consider the risks, and be aware of degree of harm. Building a tracking system that Iran might use someday is very different from building one _for_ Iran, knowing what will be done with it. ------ coldcode "I will not develop software that is intended to violate human rights and civil liberties." is a nice concept but every country's idea of what these things is both highly overloaded and mutable. What you find a violation another programmer might find acceptable. Of course you may be free to agree and someone else might argue the opposite. ~~~ NilsLoewe "What you find a violation another programmer might find acceptable" That is exactly our intention. We want every one to think about and reflect his or her own values. We believe there can't be the single one truth for all. But the real danger is, where people don't think at all about what's right or wrong. ~~~ exstudent2 Do you really think that this is needed? Are developers not operating within their own ethical guidelines? Seems weird to assume we/they aren't. ~~~ matt4077 There were some Yahoo employees who implemented the firehose delivering their users' data to the government. Considering the general political climate in the US tech scene, I can't believe they thought they were doing something good. They either didn't think about it at all, or decided their job was more important to them. Symbolics like this pledge might help with the former, because it raises the idea that software may have an ethical component. For the latter: its kind of sad that these people at yahoo – being incredibly more employable than average – still couldn't muster the courage to say no. Could've done it quietly and found a job somewhere else. Could've done it publicly and be a hero (and probably still find a job – some companies out there would've wanted to support them, or use their hiring as marketing). ~~~ ryandrake > For the latter: its kind of sad that these people at yahoo – being > incredibly more employable than average – still couldn't muster the courage > to say no. Could've done it quietly and found a job somewhere else. For this tactic to achieve its goal (that the software doesn't get written), you'd have to ensure that literally zero developers now and in the future would be willing to write the code. As was discussed in a previous thread [1], when faced with the prospect of writing "unethical code" the consensus seems to be even if a few developers take an ethical stand, there will always be some other developer out there without such strong convictions willing to write the code instead. I'd also like to point out that being employable does not necessarily mean you're in the position where you can put your job on the line for something. 
I consider myself an "employable" software guy but I've got bills to pay, and 3 months of unemployment while I find my next job is not exactly compatible with that. 1\. [https://news.ycombinator.com/item?id=12965589](https://news.ycombinator.com/item?id=12965589) ~~~ Bartweiss Obviously a vocal departure is the easy way to enforce public outcry, but I'm not sure a quiet one is as useless as this suggests. There are certainly some fundamentally-broken things that will happen as long as anyone is willing to do them - I don't expect spammers and ad fraudsters to disappear because a few programmers quit in a burst of ethics. But Yahoo wasn't dependent on unethical behavior, they had a (somewhat) viable business doing reasonable work. Those are the companies susceptible to quiet moral stands. They're not dependent on ethical lapses to survive, so they can push back on them and continue to exist. And they're big, wealthy companies that spend lots of money on getting good programmers, so if some of those talented programmers say "screw this" and quit, it's a problem even if they can replace the missing bodies. None of this addresses the last point - I don't expect people to starve over this, and I imagine most would stick around until they had a new job lined up. But I do think that pushing back on these companies can help even if there are people willing to do the dirty work. ------ ams6110 This is a nice feel-good but utterly meaningless. ------ aethertron Seems to have good ideas, but focuses on the negative. Here are some high- minded principles about what sort of software we _should_ be building: [http://www.loper-os.org/?p=284](http://www.loper-os.org/?p=284) ------ rodrigosetti I will not sign the manifesto. I prefer to do whatever I want with software, thanks. ~~~ dbg31415 We call this the Chaotic Neutral Manifesto, and it's grand. ~~~ geofft It's not a bad analogy; a software engineer is in many ways modern society's equivalent of a spellcaster. And, while a high-level chaotic neutral spellcaster might be great to have in your party, it is quite legitimate for the average village to be terrified of one who's just getting a drink in the tavern. ------ kolbe 2 is problematic, because it applies to a moving target of "human rights and civil liberties." I may agree with society's definition of these terms today, but the world changes, and with it the definition of "human rights." And while 4 is applicable for many programmers, it's not even close to universal enough to warrant a blanket pledge. ------ agentultra A manifesto is not really useful unless it's enforceable. I can sign this but I'm still compelled by my employer to act as they say with the threat that I could lose my job if I disagree and they will hire someone who will do it anyway. My employer _wouldn 't_ do that I'm sure but it's no reassurance to me or anyone else. A professional licensing organization has more weight. I would be beholden to the manifesto and would have an organization with the weight and resources to litigate if my employer tries to fire me to get around my ethical responsibilities. Fortunately there is one such program developing in the country in which I live and I'm doing what I can to pursue the education requirements to be licensed. I believe that liability and professional responsibility are going to become necessary to develop many kinds of commercial software. 
------ mstodd Too bad a manifesto that states "I will not waste the most valuable resource: time" can't exist ------ NilsLoewe Hi, I just saw that someone posted our manifesto here. Thank you! If you have any questions, please ask me here. I will try to answer them. Best regards, Nils ~~~ throwaway729 First, thanks for setting this up! It seems like everything in the manifesto boils down to #1 ("I will act according to my conscience"), because everything else is vague enough to be debatable. Some engineering disciplines augment high-level oaths (such as this one) with concrete, enforceable codes of conduct. They also put in place a barrier to entry, so that phrases such as "will do my very best" carry with them the force of assumed competence -- i.e., there's a limit on precisely how bad "best" is allowed to be. Do you support the professionalization of Software Engineering? Have we reached a point where software is important and pervasive enough that legally enforceable professional codes of conduct are now reasonable? I ask because it seems odd to me that someone would think something like this pledge is necessary and good, but also not support treating Software Engineering as a proper engineering discipline. So I'm very interested in your reasoning, especially if your answer to the above question is "no". Again, thanks! ~~~ NilsLoewe "It seems like everything in the manifesto boils down to #1 ("I will act according to my conscience")" Yes I think software engineering has reached this part of professionalism at many places. There, these specific codes of conduct are already in place (DO-178b, IEC 61508, ...). But software is reaching more and more places and the barriers are lower every day. I believe that we need the discussion about ethics across all layers of professionalism. This manifesto is of course not enforable. But it is meant as a basis for discussion. ------ jraedisch I would like there to be some promise of not wasting people's attention. Or maybe that is one of the "resources"? ~~~ crawfordcomeaux I'm developing a scientific theory of design that includes preserving attention. I'd like to hear more of your thoughts on this. ~~~ worldsayshi Nice. I've had some thoughts related to this idea. How do you test such theory? I think there is some profound idea lurking where "cache-misses" and convenience are overlapping. We can only keep so much information in our short term memory. The more convenient an interface is the more space is left for information that is strictly important for the task that the user is trying to solve. Also, as humans I believe we use association for populating our "cache". So if we have an intuitive interface it helps us to "pre-cache" information because it will trigger the right associations at the right time. Distractions will trigger the wrong things to be "cached" by association and this will slow us down. Yet another critical factor is derailment. Or the opposite of being in the zone. If we miss the cache too much we tend to start analyzing ourselves rather than the problem. A second layer of distraction occurs. Or perhaps avoidance behaviors are triggered which further distracts us. ~~~ crawfordcomeaux The current plan is to build an emotion recognition machine ([http://eqradio.csail.mit.edu](http://eqradio.csail.mit.edu)) and experiment on myself, combining that input with video and activity tracking. ------ surfmike What does not collecting too much data mean in practice? It seems a lot of analytics would be in the grey area, esp. 
when trying to do reverse geo lookups. A simpler example: can I use Apache, which gathers people's IP addresses, browser info, and referring links, if I don't necessarily know how I'll use it? ~~~ adrianN You can get in trouble in the EU if you log IP addresses without explicit consent: [https://www.eff.org/deeplinks/2010/06/european-officials-goo...](https://www.eff.org/deeplinks/2010/06/european-officials-google-yahoo-microsoft-breaking-law) ~~~ surfmike Interesting, so basically everyone running web servers with default settings is breaking the law. ------ dooglius "I will do my very best to prevent the waste of energy and resources" What determines whether something is a waste? Is writing a game that uses lots of power on GPUs a waste, because it doesn't further society? ~~~ NilsLoewe Could you reduce the power consumption by just optimizing the most critical parts? If yes: Not doing it would be waste in my view. Is writing or playing a game waste? No (in my view). ------ jsnsimpson Lol @ the sql injection attempt ------ banach I wish I had the moral fortitude to sign and stick to this manifesto.
{ "pile_set_name": "HackerNews" }
Ask HN: What are some challenges in batch processing? - Something1234 I just saw a comment along the lines of "It can take a day to start a job on a supercomputer cluster." So what are some challenges when it comes to batch processing systems? ====== fiedzia That you are not the only one who wants to use them.
{ "pile_set_name": "HackerNews" }
Auto-Scaling Web Sites Using Amazon EC2 and Scalr - iamelgringo http://developer.amazonwebservices.com/connect/entry.jspa?externalID=1603 ====== progdan Open-source projects need contributors not complainers. From what I've seen on the Google discussion forum there are dozens and dozens of people using Scalr on their own installs and through the scalr.net service provided. Seems they are using it quite effectively. ------ ivankirigin Is anyone using this? How portable is the configuration? ~~~ seekely Take this with a grain of salt, as I may have judged this too quickly, but I was not impressed. The tutorial linked is indeed pretty solid and it was trivial to get Scalr up, but the product itself seems half finished (which to be fair makes sense since its very early at v.5). I also took a gander at the code, and I was equally not impressed. I am excited at the potential of Scalr, and maybe in another few months to a year I'll take another look, but for now I would still rather just manage EC2 by hand than become dependent on Scalr. ~~~ davidn Scalr is still taking shape as can be expected with any new open-source project. There is a new version that will be released soon which will go a long way in terms of stability and robustness. Keep an eye out for it in a month or so. ------ avner there is no such thing as immediate "automatic" scaling (at least not in the sense as amazon makes it appear). That said, Yes Scalr is pretty cool, in fact it is almost a godsend for high traffic apps out there... but take this with a grain of salt- establishing server farms using pre-built generic AMIs to load balance your databases can come at a cost...literally. Scalr still has some way to go but it sure does look promising. ------ wrigley Were looking at using Mosso.com instead of AWS, seems just an easier option for us requiring less technical knowledge ------ mtw does someone know how much time it takes to start a new instance? and also the delay between 2 pollings? I'm asking since with websites like digg, traffic can 10x for your website in 60 seconds.
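On mtw's question: with a polling design, reaction time to a spike is bounded below by the polling interval plus the time a fresh instance needs to boot, which is why a 10x spike inside 60 seconds is hard to absorb. A toy sketch of such a control loop follows; the metric source and the launch call are stand-ins, not Scalr's or Amazon's actual API.

    # Toy polling auto-scaler: it can only notice a spike at the next poll,
    # and new capacity still needs boot time on top of that.
    # get_requests_per_second() and launch_instance() are illustrative stubs.
    import random
    import time

    POLL_INTERVAL_S = 60          # how often load is sampled
    REQUESTS_PER_INSTANCE = 100   # assumed capacity of one instance

    def get_requests_per_second():
        return random.randint(50, 1500)   # stand-in for a monitoring feed

    def launch_instance():
        print("requesting one more instance (boot may take minutes)")

    instances = 1
    while True:
        load = get_requests_per_second()
        wanted = max(1, -(-load // REQUESTS_PER_INSTANCE))  # ceiling division
        for _ in range(wanted - instances):
            launch_instance()
        instances = max(instances, wanted)
        time.sleep(POLL_INTERVAL_S)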
{ "pile_set_name": "HackerNews" }
Manufacturing and the Great China Firewall - radarcontact Hi All, ShinePay factories are running proprietary software to enable manufacturing operations in China. We're having a difficult time maneuvering around the great China firewall. If you have any experience with this I would love to bend your ear. Please & Thanks in advance. VPN is working via the browser but not via Python with the requests library. ping in the terminal doesn't work either. But they can access my AWS hosted server via the browser. ====== yorwba If the exact same request is working in the browser but failing for the Python program, they're probably not tunneling all traffic on their computer through a VPN, but have only configured the browser to use a proxy. You should be able to make the requests library use the same proxy. IIRC, there's an environment variable you could set to achieve this without code changes.
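To make yorwba's suggestion concrete: the requests library honours the standard HTTP_PROXY/HTTPS_PROXY environment variables, and a proxy can also be passed per call. A minimal sketch, with a placeholder proxy address rather than the poster's actual setup:

    # Route requests through the same proxy the browser uses.
    # The proxy URL is a placeholder; substitute the real host and port.
    import os
    import requests

    # Option 1: set the standard environment variables (normally done in the
    # shell or service config); requests picks them up automatically.
    os.environ["HTTP_PROXY"] = "http://127.0.0.1:8080"
    os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8080"
    print(requests.get("https://example.com").status_code)

    # Option 2: pass the proxies explicitly for a single call.
    proxies = {
        "http": "http://127.0.0.1:8080",
        "https": "http://127.0.0.1:8080",
    }
    print(requests.get("https://example.com", proxies=proxies).status_code)

Note that plain ping uses ICMP, which an HTTP proxy or browser-only VPN will not carry, so a failing ping alongside a working browser is consistent with yorwba's diagnosis.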
{ "pile_set_name": "HackerNews" }
Atomically thin LED opens the possibility for invisible displays - ForFreedom http://news.berkeley.edu/2018/03/26/atomically-thin-light-emitting-device-opens-the-possibility-for-invisible-displays/ ====== pjc50 > By laying the semiconductor monolayer on an insulator and placing electrodes > on the monolayer and underneath the insulator, the researchers could apply > an AC signal across the insulator. During the moment when the AC signal > switches its polarity from positive to negative (and vice versa), both > positive and negative charges are present at the same time in the > semiconductor, creating light. This is highly simplified, but essentially the LED is now capacitively coupled to its supply on one side, _through_ the insulator*. As a result it has to be driven with AC. Neat trick. ~~~ qbrass That's similar to how electroluminescent displays work. Except those use an insulating layer and electrode layer on both sides of the light producing layer instead of just one. ------ blauditore The standard showcase for such things are usually transparent displays. While looking cool, they would be pretty annoying: There's always interference with the background. That's also why no one uses transparent whiteboards as seen in movies. But of course, it would also be neat to just have displays on surfaces where not visible when turned off. ~~~ jasonkostempski On-screen keyboards are pretty annoying, that didn't stop them from becoming the only option available. ~~~ lowtolerance They’re the only option available because tiny mobile keyboards were pretty damned annoying, too, while also carrying the cost of dozens of potential points of failure, taking up a considerable amount of real estate regardless of how long the keyboard is actually used, and forcing a particular key layout on the user. I honestly don’t think I’d buy a phone with a hard keyboard, even if there were decent options for one. ~~~ Retric Slide out keyboards are an option. Combine with an on screen keyboard for the other layout and the only cost is a slightly thicker device. ~~~ gregmac I had a Blackberry Torch [1] for a while in the early 2010's. The keyboard was great, and you could use a touchscreen keyboard as well (it would seamlessly switch the app to fullscreen if you slide it open). It was thicker and heavier than most comparable phones, but with modern tech that would be less of an issue. The biggest problem with it at the time was it was running Blackberry OS, which was a reasonable e-mail client, mediocre (and later, noticeably outdated) browser, and abysmal app store and selection of 3rd party apps. I remember saying it would have been a great phone if they had dropped their own OS and went to Android (and spent their software dev time making their version of Android better and building their e-mail client etc on top). My next phone was an Android something or other, and by then, I was more than happy to give up the physical keyboard to get something more useful than an e-mail interface brick in my pocket. [1] [https://en.wikipedia.org/wiki/BlackBerry_Torch_9800](https://en.wikipedia.org/wiki/BlackBerry_Torch_9800) ------ panic Some more pictures from the article itself: [https://www.nature.com/articles/s41467-018-03218-8/figures/4](https://www.nature.com/articles/s41467-018-03218-8/figures/4) ------ patrickcteng Float-in-the-air 3D displays/images are in the labs already! 
[https://www.nature.com/articles/d41586-018-01125-y](https://www.nature.com/articles/d41586-018-01125-y) ------ meuk If you can layer enough of such invisible screens, you could make a nice 3d display, although I'm not sure if it would actually look nice (you could probably see through objects). ~~~ lawlessone WE could build a real Voxatron! ------ unwind This did not inspire confidence: _Commercial LEDs consist of a semiconductor material that is electrically injected with positive and negative charges, which produce light when they meet. Typically, two contact points are used in a semiconductor-based light emitting device; one for injecting negatively charged particles and one injecting positively charged particles._ Isn't that a bit too simplified? That said it sounds like very cool technology although hard to see how electrical wires to each pixel would be as invisible at wall-sized scales. ~~~ userbinator _although hard to see how electrical wires to each pixel would be as invisible at wall-sized scales._ [https://en.wikipedia.org/wiki/Indium_tin_oxide](https://en.wikipedia.org/wiki/Indium_tin_oxide) ~~~ kurthr ITO isn't really transparent... it's pretty yellow (80-95% transmission) and it doesn't carry current well (20-100ohm/sq). You'd be better off with a metal mesh with large pitch electrodes. You're going to need some wires to address the display as well (whether active or passive matrix). With a large pitch Cu or Al conductors you can do 5ohm/sq with good transparency. That will let you do 200nits (candela/m2) with the efficiency of these thin LEDs. ITO would be 10x lower conductance and only works in LCDs because they are inverting voltage light valves (not light sources). If everything worked out you could reasonably view thin LEDs in a dim office environment. That's not really interesting unless it's 3D. ------ zingmars Can't wait to wake up with ads on my windows... ~~~ oneeyedpigeon That's been happening for a while now: [https://www.theverge.com/2017/3/17/14956540/microsoft- window...](https://www.theverge.com/2017/3/17/14956540/microsoft- windows-10-ads-taskbar-file-explorer) ~~~ mstolpm I don't think the parent poster you replied to thought of -Microsoft- windows, but the ones in your home. ~~~ oneeyedpigeon It was a knowing play-on-words; forgive the triviality, but it was too perfect an opportunity ;) ------ omarforgotpwd Well what’s the point of the display if you can’t see it? ;) ~~~ ben_bai Google glass without mirrors? ~~~ swiley I think it still needs optics so your eyes can focus on it correctly. ------ silveira [https://hackaday.com/2013/12/31/case-modder-builds-lcd- windo...](https://hackaday.com/2013/12/31/case-modder-builds-lcd-window- causes-lsd-flashbacks/) ------ maxyme Isn't heat a current major concern with LEDs? If so wouldn't this have to be hooked up to a heatsink in practice making the transparency irrelevant? ~~~ pjc50 Heat is entirely dependent on brightness. It's only really an issue when using LEDs for illumination at multi-watt levels. Tiny thin LEDs are necessarily not going to have a huge power output. If the substrate is glass it will have decent thermal conductivity. ------ amelius Can it show black pixels? What is the contrast ratio? ~~~ comboy This is just a single LED. And you are asking about specs of some future product that may use it. I think you can make background dark (from transparent) by using good old LCD technology. Because that's basically what LCD is about. 
------ drieddust Once it becomes cheap enough, this might give a huge boost to indoor farming. ~~~ nkozyra Not being familiar with the area, what problem would this solve for indoor farming? ~~~ jfindley At a guess, being able to easily augment sunlight in parts of the world that get few hours of sunlight during local winter. Sun shines through when it's there, LED takes over when not. However in practice I don't think this would be useful - they're considerably less efficient than existing LEDs (currently 1% vs 25-30%) and I think that efficiency is more of a problem than transparency. ~~~ trisimix They aren't diodes either ------ Armisael16 This isn’t an LED (in fact it isn’t a diode at all). Can we rename the submission title? ~~~ Raphmedia Looks like they meant LED as in _l_ight _e_mitting _d_evice. ~~~ Armisael16 That simply isn't what the acronym LED means. It's equivalent to using WMD to mean 'weapons of mass disruption'. ------ baybal2 Looks like they figured out how to make LEDs with atomic layer deposition ------ billysielu Apple will love this, a screen that breaks even easier. ~~~ diggan Seems to be bendable so maybe breaking it is harder than current screens. > “The materials are so thin and flexible that the device can be made > transparent and can conform to curved surfaces,” said Der-Hsien Lien, a > postdoctoral fellow at UC Berkeley and a co-first author ~~~ SuddsMcDuff I expect once the material has conformed to a curved surface it would have to be rigid from there on out. ~~~ pjc50 Maybe, maybe not. Flat flexible circuits are established technology. LG already have a flexible display, although apparently it has a dead pixel problem. [http://www.bbc.co.uk/news/technology-35230043](http://www.bbc.co.uk/news/technology-35230043) ------ mojomark Transparent OLEDs (TOLEDs), Transparent Quantum Dot LEDs (QD-LEDs), Transparent Inorganic LEDs, have all been around for a long time now: [https://www.youtube.com/watch?v=IJQzPrkkH6A](https://www.youtube.com/watch?v=IJQzPrkkH6A) [https://www.youtube.com/watch?v=Lv46YU-X9Xs](https://www.youtube.com/watch?v=Lv46YU-X9Xs) I'm confuzzled by the 'news', although the AC drive, vice DC is interesting I suppose. ~~~ ricardobeat Feature wise they sound similar, but we’re talking about three atoms thick here. This could be world-changing tech if it gets efficient enough.
{ "pile_set_name": "HackerNews" }
C vs GO - akg http://crypto.stanford.edu/~blynn/c2go/index.html ====== copx I like Go but it's not a replacement or even just competition for C. Garbage collection alone ensures that. C is for low-level code, where you usually want deterministic, real-time behavior. Go cannot deliver that because of its GC. You can certainly write web server software in Go (what Go was designed for), but could you write an "AAA" video game in it? Or a mission critical embedded system with real-time requirements? Also, the author's portrayal of C is misleading. If you have 'naked' malloc() calls all over your code you are doing something wrong. Much C code never calls malloc() or follows the "no allocation after initialization" principle. If you write C like Python you will run into problems. In C you do not wildly allocate objects on the heap all across the program. E.g. in my current program almost everything comes out of memory pools. E.g. Object* object = ObjectNew(); ..and if I forgot to release it I would know quickly because any "memory leak" would cause the (pre-allocated, fixed size) memory pool to run out of free slots. In C you should manage memory in a systematic fashion. I think the C standard library should be seen as the very foundation of a program, something you use to build the abstractions which solve the problem, not something you use directly to solve the problem. People often warn about the dangers of strcpy() (an the author uses it). I never use it except at the lowest levels. At the high level robust C code looks more like this: ObjectSetName(object, newName); than this strcpy(object->name, newName); ..the difference is that ObjectSetName() can internally guard against overflow and that the concrete details of the object data structure remain hidden and thus later code changes are easier. It is a very common C idiom to use incomplete types and such functions to achieve a very high level of encapsulation. ~~~ luriel Your C code looks very unidiomatic to me, but I guess that is relatively a matter of taste (still I don't understand why would somebody use C if they don't like to write code in the 'classic' C style as seen in the original Unix and Plan 9 source). That aside, in Go you can also do fixed allocations and manage your own memory pools to avoid the GC. And I would also question how many C programs require real-time behaviour (even the definition of 'real-time' has issues, most things people call 'real- time' aren't). ~~~ nux6 E.g. any app with [record] button need to meet 'mostly' predicted system latency, that's 'soft-real-time'. Any professional app with [play] and [record] button needs a minimal warranted latency, that's 'hard-real-time'. Go is unsuitable for both. ~~~ pjmlp And yet there are sound manipulation tools developed in Java and .NET... ~~~ pcwalton Well, to be fair, Java and .NET have generational, concurrent, incremental garbage collectors. ------ fingerprinter I guess I might be the only one to say this, but is this a joke? Some sort of prank? Not a single Go program is more readable, and I would argue that they are ALL less readable (and I like Go). I honestly can't tell if he is trolling or if he is serious. ~~~ jbooth I think Go's strengths over C really start to shine when you're writing programs longer than 100 lines. Not having an exception mechanism, interfaces, single namespace, no attaching methods to structs etc are fine in a small program, but they make bigger programs harder to digest pretty quickly. 
~~~ eswangren Well, the operating system you used to write that comment was probably written in C, so apparently it does in fact work in large projects. It has weaknesses to be sure, but I don't think lack of OO (I.e., structs with functions) is one of them. ~~~ dllthomas Particularly given that you _can_ write OO code in C, and _can_ decorate structs with functions, it's just messier. ~~~ StephenFalken "Object oriented programming with ANSI-C": <https://ritdml.rit.edu/handle/1850/8544> ------ simias Damn. I can't comment on the Go listings, but could he make his C code any _less_ readable? What's the point of making the code so dense anyway? Without syntax highlighting I gave up pretty quickly. ~~~ adobriyan The point is to show that Go is superior to C (which obviously is true but not for the reasons mentioned in the article). Anything counts. Go code is obviously formatted with gofmt. C code is obviously not formatted with "indent -kr". Liberal use of comma operator unseen in real world. Look at example of yes(1) with error-handling. Author doesn't use strdup(3). But even if he did, it still doesn't make sense to call fprintf(3) once due to allegedly hard error handling. Error handling is not common path, fprintf(3) doesn't fail! He concatenates all string into once which makes it O(n^2) because of many many times strlen(3) is called (and kernel copied argv contents before that!). I bet he did error checking wrong on C side (just checking return value should not be enough). It'd be interesting to see same code on Go side. Obligatory quote: "considering that bad code can be written in any language, any language comparison performed using examples must be judged by the quality of the examples in each language. it can't be all that hard to write a bad example in language A and a good example in language B, and then proclaim language B to be the winner -- this is how people compare languages all the time, so either those who read them are bad programmers in any language (or are not programmers at all) and don't know how to reject the bad examples, or they already agree with the author of the comparison that language B is better than language A. in either case, it's a waste of anything but marketing money. ... #\Erik" ~~~ shadowmint This has been discussed before, so many times. In fact, in pretty much every thread go turns up in. _sigh_ I'm just going to link to my favourite review now, again (pertinent, and mentioned here only because the author of the original shared a favourite with me): <http://www.math.bas.bg/bantchev/misc/on-go.html> Quote: "But I know I am not going to love Go. True beauty evades this language. Go may be practical, but is also eclectic, and has taken some unconvincing or downright ugly design choices. It definitely lacks that subtle but unmistakable touch of elegance that makes a language great." ~~~ luriel I'm not sure how to take that review given that his two main complaints seem to be: 1) That Go has i++ and not ++i. Don't even know what to say to someone who thinks this a major issue, really. 2) That the declaration syntax is 'unattractive' and different from other languages. Yes, the syntax looks a bit strange at first if you are used to C, but it is unquestionably cleaner and better, specially in more complex declarations. 
This is even covered in the FAQ: <http://golang.org/doc/go_faq.html#declarations_backwards> His other complaints seem to be about the name of the language and how much some of the Go documentation acknowledges the influences of certain languages, which as he himself says, is just politics and not really relevant to the language itself. At the same time there seems to be plenty of people who actually have used Go and love it, including the designers of other languages: <http://go-lang.cat-v.org/quotes> ------ ralph I gave up reading this very early on. The striving for compactness of the source, in both C and Go, makes it misleading to read. Take if (argc < 2) puts("y"); else { for(int i = 1; i < argc; i++) { if (i > 1) putchar(' '); printf("%s", argv[i]); } putchar('\n'); } A casual skim sees the _if_ followed by an indented block, but that isn't the then-path but the else-path. Now yes, I can read it carefully and follow it, just as if I was debugging someone else's poorly formatted code, but in doing so my attention is being distracted from the main point of the article and frankly I have a huge pile of other things to read that are potentially more rewarding. ~~~ JackdawX Your post adds nothing useful to the discussion. There is no point arguing about indenting style in a 7 line program, it's needless pedantry! This is almost exactly the same thing as grammar nazi-ism, and seems to have a similarly negative impact coding related websites. I think we need to coin a new term - indent-nazi, style-nazi, or something similar - for the purpose of dismissing this kind of post and keeping people on topic. ~~~ swdunlop Actually, "indent-nazism" is a big thing in the Go community. The Gofmt utility, which takes an AST of your code and normalizes it to a common style. Nobody agrees that the style is perfect, a lot of people have religious preferences when it comes to brace placement. But Gofmt ensures that all these people can find a consistent format when it comes time to diff. GoSublime and go.vim both integrate gofmt into the editor; you start to miss it when refactoring, because you can just shrug and say "gofmt will clean it up when I save" when you move a block to a different function or indent level. I agree with the grandparent -- seeing non-gofmt code is jarring and deliberately distracting. It's like someone writing an entire Python program with nothing but lambdas. ~~~ drivebyacct2 It's not just diffs, you learn to scan code. The same way that you can't use i++ inline, having consistent indentation and treatment of syntax allows for quicker at-a-glance human parsing. ------ gouranga Interesting comparisons. However (not slating Go, which I think is excellent), I genuinely think Go is going to go the same way as plan9 eventually. Unfortunately, its predecessor (C) is good enough, much as UNIX was good enough compared to plan9. ~~~ nux6 The devil is in details. Go is great for hi-level or simple stuff like that, but is not really usable for system programming. If things are a bit more complicated, the go runtime stays in the way. For those problems, after a while, the bare metal support was removed from the language completely... ~~~ pjmlp Ever heard of Oberon, Modula-2 and Modula-3? These languages also had OS fully implemented on them, just with tiny bits of Assembly to do the very low level stuff, similar to what C would require anyway. The only advantage of C is that its runtime is so small, that in practice the OS is the C's runtime. ~~~ mkup It's not the only advantage. 
Another, more important advantage of C and C++ over Go is a lack of stop-the-world garbage collection (and predictable memory usage under high load). ~~~ rwmj You can predict the time that malloc(3) and free(3) take to run? You know they are backed by very complex data structures that need to manipulated and iterated over? The point you mention is only valid when using entirely static memory, which is a very rare case in real C code (only really used in a few small embedded codebases). ~~~ pjmlp > You can predict the time that malloc(3) and free(3) take to run? Thanks for pointing this out. This is often a misunderstanding from manual memory management fans. In many cases malloc()/free() also behave non-deterministic. This is the main reason why for special types of applications, you need to have malloc()/free() implementations specially targeted to the use case at hand. This is no different from the GC runtimes, which are coded specifically for real time applications, like avionics, for example. ~~~ ori_b It's very different in one significant regard: it's very easy to write your own acceptably performant and deterministic malloc and free, compared to the effort it takes to write a GC that would fit in to those constraints. ------ delinka My complaint about Go centers around its mechanics of code reuse: static linking for all Go code; making anything Really Useful requires using other libraries, which includes those written in C, which require use of a tool to help you write your wrapper ... If we could get dynamic linking and something less cumbersome for interfacing with existing libs, I could use Go for Serious Work. ~~~ shadowmint I dont see why this got downvoted. It's the biggest issue I have with go as well. It's a royal pain writing go that talks to C code, compared to say, lua or python, and there just _isnt_ a way to make other languages pickup go libraries and run the symbols from them afaik... ~~~ luriel > It's a royal pain writing go that talks to C code, compared to say, lua or > python This is plain wrong, you can pretty much call C code directly, while in Python and the like you _really_ have to write a wrapper. Of course in Go you will write a wrapper anyway to give the library a more Go- like API, but I don't see how this could be any worse than in any other language. > there just _isnt_ a way to make other languages pickup go libraries and run > the symbols from them afaik... To get non-Go code to call Go code is trickier (also Go really wants to run your main()), but can be done, and there are even libraries to write Python extensions in Go, see: <http://gopy.qur.me/extensions/> ~~~ xyproto Also, there's assembly. <http://stackoverflow.com/a/6535590/131264> ------ Graphon People suggesting that Go is a potential replacement for Python, Lua or Ruby are missing the point. IMO, Go isn't designed to compete with those existing languages for existing opportunities. The key opportunity in the future is smart devices everywhere. Embedded, connected intellgence, everywhere. Everything is a communication device. Today your phone and your car; tomorrow: Your shoes, your office, the grocery store, your refrigerator. Think of xbox Kinect-type sensors embedded into everything. Writing solid C code for all those systems will be too hard. We also definitely do not want an serendipitously-designed language (Javascript). Yes, that leaves Python Ruby and so on, which brings us full circle. Go will compete with those languages but not in the domains that are evident today. 
Not in web browser, and not in a new! improved! web server. It seems to me that Go is a forward-looking design, aimed to meet the challenges of the everything-connected world of tomorrow. To make tomorrow happen, we need a better C. Go is that. ~~~ heretohelp Go is highly inappropriate in embedded environments so your pipedream betrays a naivete to systems programming. ~~~ xyproto * You can turn off the GC. * You can compile with gccgo. * You can call assembly code from Go. What is the major hindrance from using Go in embedded environments? ------ revelation I can't do system programming using Go on a platform for which there is no compiler. There is probably a reasonable C compiler for every platform in existence out there. ~~~ Devilboy Right now there's Go compilers for Windows, Mac and Linux. I'm sure others will follow soon. ~~~ dchest \+ FreeBSD. Also, OpenBSD, NetBSD, and Plan 9 in the works. ~~~ luriel OpenBSD and Plan 9 pretty much work already, even if the ports might not be as polished as FreeBSD. Also there is gccgo, which AFAIK also works on Solaris and probably elsewhere. ~~~ 4ad Yes, it works on IRIX and RTEMS (an embedded OS), and the gc suite was also ported on Ethos. I'd also like to add that the gc compilers support ARM as well as x86 and amd64 and are very, very easy to port to new platforms. ------ kristianp Mods, the title should be "C to Go", Go is not spelled with all-caps. ------ iamgopal err..sarcasm ? ------ papsosouid Go is the best worst language I've used. Most of what I do fits pretty much right in the sweet spot for go, network services and web development. Stuff I would have previously used C and (insert scripting language here) for respectively. Go fits into both of those areas really nicely, and I prefer it over C and scripting language X for these tasks. But the problem is, I've already tried haskell, which also fits that same area, and is semantically a vastly better language. I just wish go had been more willing to push the envelope and at least try to be somewhat modern and useful instead of being "C with modules, but slow". On the other hand, haskell is a terrible language syntactically, and from a development environment perspective. Go is near enough to perfect in those regards. Having a fast compiler, a simple, working build system, a sane and easily enforced code format all make a huge practical difference. I wish I could use go, as it is much nicer than haskell from a usability perspective. But the language is just too primitive. ~~~ eeperson Might I suggest Scala[1]? It is a modern languages with mature build tools and development environment. It also has a lot of nice tools surrounding network services and web development[2]. [1] <http://www.scala-lang.org/> [2] <http://typesafe.com/stack> ~~~ papsosouid You certainly can suggest it, I already tried it though. ;) Scala ends up more like the worst aspects of haskell and go combined, rather than the best aspects of both. ~~~ soc88 Details, please. ~~~ papsosouid Pretty much what I described above. The bad parts of haskell (compile time, memory usage while compiling, bad syntax, terrible dependency/package/module management) are all present in scala, and it is semantically an inferior language to haskell. Scala is certainly a whole lot closer to haskell than go is, and I certainly don't blame anyone involved in the creation of scala for its limitations, the JVM limits how good they can get. 
But at the end of the day, using scala would solve none of the problems I have with haskell, and leave me with a less powerful language. ~~~ eeperson Out of curiosity, what make the Go dependency management so good? I'm not terribly familiar with this aspect of Go. Based on a what I've read, it doesn't seem dramatically different than what you would do for Scala (except it is packaged with the language rather than an external tool like SBT). Is there something I am missing? ~~~ papsosouid SBT just fails to work correctly on a semi-regular basis. Quite a few scala people ended up resorting to maven instead, which says a lot. ~~~ eeperson That has not been my experience at all. SBT has always worked correctly for me. When was the last time you used Scala and/or SBT? I know a lot of the tools in the Scala ecosystem were unstable prior to Scala 2.8. However, now they have become pretty good. Even the, previously notoriously bad, Eclipse plugin has become good. ~~~ papsosouid ~6 months ago. I'd certainly say the eclipse plugin has improved a lot, it's better than the intellij one now. It is still quite a ways away from good though. ------ drivebyacct2 I've been using Go exclusively in my personal projects for the last 8 months now, am in love with the simplicitly and fun of writing it (and goroutines), but this is a horrid way of introducing the language. To anyone who doesn't appreciate the Go syntax style, this is an instant turn off. There is a reason there is an idiomatic style used by... every single Go project I've ever seen. Further, these examples are so trivial that one doesn't see an advantage over C and so this comment thread is like every other. Those who've written "Hello World" dismiss it as neither C nor Haskell and most others seem to be generally happy with it. ------ tubbo I mean, the only other living inventor of C is working on Go... ------ millerfung In my opinion, any new language today should be all beginner friendly, there are large pool of people who are interested and it means a great future of our world. ~~~ EliRivers "any new language today should be all beginner friendly" I disagree. I think any new language should be designed to best meet the kind of problem it's being created to deal with. Making it "beginner friendly" (whatever that means - I bet there are lots of different interpretations) is a nice extra, but should not come at the expense of solving the problems. ~~~ millerfung Yes I agree that new language should be developed because of the ever rapidly changing digital world to enhance everyones experience in digital products. I am sure it is going to get more complicated, however, being a beginner friendly language should still be taken into account. In fact, only the one who is more beginner friendly will survive in the long run because in the future there might be a mismatch of demand and supply of coders. Living in this moment of our planet is exciting because of the experience we have having digital products around us everyday, and there are/will be lots of teenagers who wants to get involve as well. In my opinion, coders are like "workers" in manufacturing companies in the future. ~~~ EliRivers "In fact, only the one who is more beginner friendly will survive in the long run" That makes no sense to me. In any industry with a sizeable number of workers, there is a huge range of tools, from the entry level easy-to-use up to the fantastically intricate and arcane. Tools that are not "beginner friendly" are that way for a reason. 
As an example, assembly code is never going away; for the obvious reason that if nobody understands it, nobody can write compilers, and also because there are cases where knowing assembly is useful and helps to make better code, whether it's taking apart the code to really, truly get every last clock-cycle of power out of the thing, or to take apart code in the search for arcane bugs and wonderfully subtle interactions causing unexpected behaviour. Assembly is old, and will never go away, and is (for most meanings of the phrase) not beginner-friendly. The only advantage to a language being "beginner friendly" is that beginners can learn it fast. What you then get are inexperienced programmers who know just enough to be dangerous (this is not an attack on them; it's the case in any industry with a low barrier to entry, and a stepping stone to being better). One expert, experienced programmer with knowledge of a "beginner unfriendly" language is worth literally dozens of first-day coders wielding their hand-holding, garbage-collecting, counting-begins-at-one modern version of BASIC. That is never going to change, and every first-day coder wants to become that expert. Even if somehow all the non-"beginner friendly" languages died, the very next day someone who'd been coding in this "beginner friendly" language for a decade would finally get sick of it and start designing a language she can truly express herself in without having all the hand-holding that holds her back. ~~~ millerfung That's a very good argument, I will accept this :) hopefully, at the end, there is a language that is easy to understand at first and therefore people could easily pick it up, like English?
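Coming back to the embedded-Go exchange earlier in this thread: the "you can turn off the GC" point is easy to demonstrate with the standard library. A minimal sketch using `runtime/debug` (the fixed buffer and the loop are illustrative assumptions, not advice for any particular embedded target):

```go
package main

import (
	"fmt"
	"runtime/debug"
)

// Preallocate working memory up front so the steady state
// does not depend on the collector running at all.
var buf [1024]byte

// fill writes a repeating value into p and reports how many bytes it wrote.
func fill(p []byte, v byte) int {
	for i := range p {
		p[i] = v
	}
	return len(p)
}

func main() {
	// A negative percentage disables garbage collection entirely
	// until it is re-enabled (or the process runs out of memory).
	debug.SetGCPercent(-1)

	// Steady-state loop: reuse the preallocated buffer rather than
	// allocating per iteration, so running without the GC stays viable.
	for i := 0; i < 3; i++ {
		n := fill(buf[:], byte(i))
		fmt.Println("filled", n, "bytes with value", i)
	}
}
```

The same effect can be had with `GOGC=off` in the environment; whether either is acceptable obviously depends on how little the rest of the program allocates, which is where the real embedded-suitability argument lives.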
{ "pile_set_name": "HackerNews" }
Monaco-React: Monaco Editor for React Wrapper for Integration Without Webpack - praveenscience https://monaco-react.surenatoyan.com/ ====== surenat Yes, @ingo_ :) It supports 56 languages; Python as well. Also, you can write your own language, with a custom formatter and custom tokenization logic. ------ ingo_ Does it support Python?
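A rough sketch of what "write your own language, with a custom formatter and custom tokenization logic" looks like against the underlying monaco-editor API (the language id, grammar, and formatting rule below are made-up placeholders, and how you obtain the `monaco` instance from the React wrapper varies between wrapper versions):

```typescript
import * as monaco from "monaco-editor";

// Register a hypothetical language id with the editor.
monaco.languages.register({ id: "mylang" });

// Custom tokenization via a Monarch grammar: keywords, numbers, comments.
monaco.languages.setMonarchTokensProvider("mylang", {
  tokenizer: {
    root: [
      { regex: /\b(?:let|fn|return)\b/, action: { token: "keyword" } },
      { regex: /\d+/, action: { token: "number" } },
      { regex: /#.*$/, action: { token: "comment" } },
      { regex: /[a-zA-Z_]\w*/, action: { token: "identifier" } },
    ],
  },
});

// Custom "formatter": this one only strips trailing whitespace on each line.
monaco.languages.registerDocumentFormattingEditProvider("mylang", {
  provideDocumentFormattingEdits(model) {
    const formatted = model
      .getValue()
      .split("\n")
      .map((line) => line.replace(/\s+$/, ""))
      .join("\n");
    return [{ range: model.getFullModelRange(), text: formatted }];
  },
});
```

An editor created for a model in "mylang" then picks up both providers; wiring that into the React wrapper is just a matter of passing the language id through whatever prop the wrapper version exposes.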
{ "pile_set_name": "HackerNews" }
Debugging Lisp Part 1: Recompilation - raphaelss http://malisper.me/2015/07/07/debugging-lisp-part-1-recompilation/ ====== pavelludiq On more than one occasion when writing web code I found myself in the debugger in the middle of a request handler. The browser just sits there waiting for a response. I found the bug and restarted the handler before the browser timed out. The browser received the response as if a bug never existed. Of course this only works if you can find and fix a bug in under 120 seconds :) ------ bainsfather As a non-lisp user, this {easy-debug -> fix -> resume program} appears to be extremely useful, and something that only lisp offers. e.g. wading through clojure stack traces seems very primitive in comparison. How important is it in practice? Can/could other languages do this? As an outsider looking at new (to me) languages, it's hard to judge what the pros and cons of a language are - especially the cons - e.g. several years ago, when looking at clojure, it took a lot of searching before I concluded that clojure did _not_ have this feature (maybe that has changed now?). Meanwhile the 'learn language x' books always seem to focus only on syntax, rather than workflow. ~~~ yeureka Edit and Continue in Visual Studio is similar to that. You can change the source code while the program is paused in the debugger, for instance at a breakpoint, and continue after your modifications. ~~~ bainsfather Interesting, thanks. Do you get the ability to move the execution point back to _before_ the error, in order to resume execution? - from what I can tell, this isn't possible - from [https://msdn.microsoft.com/library/y740d9d3.aspx#BKMK_Set_th...](https://msdn.microsoft.com/library/y740d9d3.aspx#BKMK_Set_the_next_statement_to_execute) "Setting the next statement causes the program counter to jump directly to the new location. Use this command with caution:" "If you move the execution point backwards, intervening instructions are not undone." ~~~ yeureka Not sure, I have never tried moving the instruction pointer after Edit & Continue. Moving the instruction pointer backwards has been fine in my experience if you choose a suitable line to reset your state.
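A minimal sketch of the recompile-and-resume workflow being described, in Common Lisp with SLIME (the handler and the key bindings are illustrative; exact bindings differ between SLIME and Sly versions):

```lisp
(defun handle-request (request)
  ;; BUG: wrong property name, so USER-ID comes back as NIL and the
  ;; arithmetic below signals an error, dropping us into the debugger
  ;; while the browser keeps waiting for its response.
  (let ((user-id (getf request :usr)))
    (format nil "user ~d has ~d points" user-id (* 10 user-id))))
```

From the debugger you change `:usr` to `:user`, recompile just that function (C-c C-c), select the `handle-request` frame in the SLDB buffer and restart it (typically bound to `r`), and the original call completes as if the bug had never been there -- which is exactly the under-the-timeout trick described above.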
{ "pile_set_name": "HackerNews" }
Half of Dr. Oz’s medical advice is baseless or wrong, study says - mudil http://www.washingtonpost.com/news/morning-mix/wp/2014/12/19/half-of-dr-ozs-medical-advice-is-baseless-or-wrong-study-says/ ====== CapitalistCartr I wonder how much of most doctors' advice is equally wrong, outdated, etc. And how many studies are flawed; the studies that are the basis for what is right. ~~~ capkutay Dr Oz's incentives are different. He's a day-time television talk show pundit. Fad diets that inspire false hope and supporting big time health/vitamin company brands directly translate into him making tons of money. I don't think most doctors are compensated for this type of behavior. ~~~ dmix True, this is similar to the classic question as to why does a top-of-their- game baseball players make millions of dollars while top-of-their- field-<doctors/engineers/etc> don't make a similar amount. The reason is because their skill set reaches hundreds of thousands of people while a doctor/engineer can only reach maybe a thousand. The rationalization is they can reach a massive amount of people and their effect on the world is much broader (regardless of having a higher impact). I honest have never seen a Dr Oz show but I know he has a broad audience therefore he should have a much much higher level of scrutiny than any other doctor. Regardless the OP's question is still valid, if only the goal is to improve upon the output of the science. ------ dewey John Oliver did a segment on just that a few months ago, it's worth watching: [https://www.youtube.com/watch?v=WA0wKeokWUU](https://www.youtube.com/watch?v=WA0wKeokWUU) ------ jlrubin I'm just surprised half is correct. ~~~ dangerlibrary Most of his recommendations are probably binary (eat red meat / don't eat red meat). Given that, 50% is what you'd expect from a random medical advice generator. ~~~ jlrubin I agree, hence my comment -- I find that surprising given the bias to selling snake oil. ------ coding4all Duh. I have parents that've been watching his show for years and I can almost guarantee they're not anywhere close to where they need to be in terms of health goals. People who watch this show are the same people who drink diet soda and eat sugar-free cookies, brownies, and let's not forget, low sodium bacon. In other words, people who don't understand that those things aren't bad for you in moderation... Nothing says fucked like eating a crate of gluten-free cookies. ------ aurora72 There's one great American doctor, David Perlmutter whose advices are remarkable and in his book Brain Grain it's stated that Dr. Perlmutter serves on the Medical Advisory Board for The Dr. Oz Show and that he appeared on those shows several times. That's an important reference because Perlmutter is one of my most favourite doctors. I personally don't watch Oz's TV programs often, but in Washington Post's article, it's underlined that "OZ's advices are not backed up by medical science" But wait, is the medical science itself 100% reliable? Oz at least presents the medicine in an entertaining manner and many of his viewers take his advices with that in mind, for instance me I take his shows lightly just as I take the unshowy doctors advices lightly. Just one video by an undoctor called "Lemon Juice & Apple Cider Vinegar - YouTube" is more beneficial than "medical science" to me and the quick scientific proof included. ------ ChuckMcM Is this surprising? There is lots more money to be made if you push snake oil for the snake oil manufacturers. 
And hey, the placebo effect will give you some 'success' either way, right? ------ whistlerbrk We live in a world where doctors get kickbacks from pharmaceutical companies for prescribing their drugs; what did we expect? Professional organizations and licensing boards need to stand up for the integrity of their professions. Revoke his medical license and start to clean up this system. ~~~ k-mcgrady >> We live in a world... You mean, "We live in a country..." ~~~ whistlerbrk I thought about that, but I doubt this only happens in the US with multinationals at the helm.
{ "pile_set_name": "HackerNews" }
Clojure spec – predictive specifications of data and functions - joubert https://vimeo.com/195711510 ====== based2 [https://github.com/Mamun/spec-model](https://github.com/Mamun/spec-model) ------ based2 [https://clojure.org/guides/spec](https://clojure.org/guides/spec) [https://en.wikipedia.org/wiki/Design_by_contract](https://en.wikipedia.org/wiki/Design_by_contract) ------ based2 [http://www.metosin.fi/blog/schema-spec-web- devs/](http://www.metosin.fi/blog/schema-spec-web-devs/)
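For anyone who hasn't seen it, a minimal sketch of the "specifications of data and functions" idea the links above cover (the `::user` spec and the `scale` function are made-up examples, not taken from the linked projects):

```clojure
(require '[clojure.spec.alpha :as s])

;; Data specs: a user is a map with a string name and a positive integer age.
(s/def ::name string?)
(s/def ::age pos-int?)
(s/def ::user (s/keys :req-un [::name ::age]))

(s/valid? ::user {:name "Ada" :age 36})      ;=> true
(s/explain-str ::user {:name "Ada" :age -1}) ;=> string describing why :age fails

;; A function spec: argument specs, return spec, and a relation between them.
(defn scale [x factor] (* x factor))

(s/fdef scale
  :args (s/cat :x number? :factor pos-int?)
  :ret number?
  :fn #(= (:ret %) (* (-> % :args :x) (-> % :args :factor))))
```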
{ "pile_set_name": "HackerNews" }
A beauty contest winner making Japan look at itself - RobAley http://www.bbc.co.uk/news/world-asia-32957610 ====== hudstew There are a few things to keep in mind: 1) Ariana Miyamoto was chosen as Miss Japan by Japanese people. 2) The critics of Miyamoto are an extreme, vocal minority on Twitter and 2chan. 3) Although Japanese contestants often perform well at the international level, beauty pageants of this sort do not receive much attention within the country regardless of the contestant. 4) "Hafu" is a Japanese word derived from the English "half". It is used to refer to biracial people, but there is no connotation that they are "half" a person. I cannot understand the logic of the author who says that the word sounds derogatory in English -- it's a Japanese word, after all, and is not derogatory in Japanese. The author uses point (2) as well as his short personal experience in Japan to make broad statements to the effect that Japanese people in general do not accept those whose appearance differs from their own. Japan makes an easy target in this regard because of the well-known (outside Japan, at least) concept of Nihonjinron, as well as the persisting sense that Japan is an isolated country where outsiders will never be accepted into the "in group". Due to this, there is an expectation (particularly in foreign media) that Japanese are racist, and any proof, even that of an extreme minority on Twitter, is taken as evidence in support of this. Of course, living in Japan, I remember hearing similar broad statements about racism in the United States concerning the backlash against electing a black president in 2008. The real lesson here is that we should not be basing our impressions of an entire society on a vocal minority, no matter how much their opinions conform to our expectations. ~~~ kstenerud Yes, let's forget about all the nightclubs I couldn't go into, being physically barred by guards outside holding their crossed arms, saying "gaijin dame!". Let's forget about the complete lack of upward mobility in any office I worked in. Let's forget about the neighbors who completely shunned me until it was my turn as the block representative, at which point they criticized my lack of understanding of the bureaucracy. Let's forget about the countless times people would cross to the other side of the street as soon as they saw me coming. I just thanked my lucky stars that I wasn't Brazilian. They have it the worst. Now, I had lots of nice experiences there, especially in the countryside where people are a lot friendlier, but by no means can you say that racism does not permeate their entire society. ~~~ Nadya Strange how much our experiences differ. People in the countryside were far more 'racist' to me. I went to parts of the country where they rarely, if ever, see an 'outsider'. I would be stared at by wide-eyed children and shocked elders alike. I would be pointed at and _noticed_ because I was different. In the city, I experienced none of this. I put 'racist' in quotes because, although I felt like an attraction, it wasn't a bad or biased opinion of me. It was simply "That's a rare sight! Look over there, a foreigner!" "Gaijin" don't tend to know social norms. That's a fact. Some do, most don't. It's easier to ban them all then have to kick out 95% of them. They cause problems because of said lack of knowledge of social norms. Why deal with problems? It might lose you customers. It's much easier to ban any and all foreigners. 
Same reason you can't go to many bathhouses if you have a tattoo. Having a tattoo is seen as being part of the Yakuza and Yakuza are nothing but trouble. So instead of kicking out Yakuza, they don't allow people with tattoos. Are you Yakuza because you have a tattoo? Of course not. It's just easier to ban tattoos preemptively to prevent problems. ~~~ kstenerud Yeah, my experience was similar in the countryside. But I don't consider being regarded as a curiosity as racism. They had no real preconceived notions as to how I would behave, and so our interactions were always cordial (though it helped that I was familiar with Japanese customs). I'd get invited to the local izakaya and we'd have a grand old time shooting the shit over sports, politics, etc. They'd want to know what I thought about pretty much everything (as would be expected when someone from an area you'd never seen before comes to town). ~~~ Nadya If your Japanese is near-perfect you can tell them to let you enter the "no foreigners" nightclubs and bars claiming you were born and raised in Japan. Just pick a place as your hometown. If it were for racist reasons, they wouldn't let you enter simply because you don't "look Japanese". Not to say the issue doesn't exist at all. Just that it isn't prevalent. If it didn't exist, there wouldn't be this comedy sketch about it: [https://www.youtube.com/watch?v=oLt5qSm9U80](https://www.youtube.com/watch?v=oLt5qSm9U80) Although the scenario I experienced was that the waitress asked my Japanese friend what I wanted to order rather than asking me. Assuming I didn't speak Japanese. At most places I ordered food at - they would ask me what I wanted directly. So I wouldn't say it's a common experience either. Just that it does happen and it's because few foreigners speak Japanese! It's a habitual thing, not a racist thing. It happened more in the touristy parts of Tokyo than anywhere else (even in the touristy parts of Kyoto!) ~~~ un_ >claiming you were born and raised in Japan. Just pick a place as your hometown. I can't say I have not considered this, but it requires "faking" yourself, even for just a minute. It also demands that you have a high degree of Japanese language. I don't think it's entirely unreasonable to expect people who do not look as though they know the language in a virtually homogeneous society to not know the language. I've heard that the people who encounter this "get around" it by demonstrating that they do know at least something. I read a blog post once that language is one of the biggest barriers; seriously dedicating one's time to learning the language is something that many, many non-native speakers simply miss out on - and, in some cases, go onto assume that this response is because of xenophobia or racism. I haven't been in Japan long enough to give my own anecdotal experience, though. 
------ jpatokal > _There is no word like hafu outside Japan, but I think we need it here._ Welp, she's Japanese alright, she's even mastered _nihonjinron_ : [https://en.wikipedia.org/wiki/Nihonjinron](https://en.wikipedia.org/wiki/Nihonjinron) But more or less the same concept elsewhere: [https://en.wikipedia.org/wiki/Hapa](https://en.wikipedia.org/wiki/Hapa) (Hawaiian, and another loan from the English "half") [https://en.wikipedia.org/wiki/Luk_khrueng](https://en.wikipedia.org/wiki/Luk_khrueng) (Thai for "half child") [https://en.wikipedia.org/wiki/H%C3%BAnxu%C4%9Br](https://en.wikipedia.org/wiki/H%C3%BAnxu%C4%9Br) [https://en.wikipedia.org/wiki/Mestizo](https://en.wikipedia.org/wiki/Mestizo) ~~~ jboy Here in Sydney, Australia, the term "halvie" seems to be relatively common amongst those who discuss such things. It's not considered derogatory or offensive; at worst, slightly unsubtle or socially inept (like any labelling of a person by their race). Friends use the term lightly or playfully. Australia is a predominantly-white country that is geographically closer to Asia than to any other white or Western countries. We have a long history of Chinese immigration. Sydney, especially inner-city Sydney and the areas around the four major universities, have large proportions of predominantly-Asian international students, many of whom later settle here. White-Asian interracial dating is very common in Sydney. Especially at the universities amongst students in the Science, Engineering (including Computer Science, of course) and Business schools. And so, there are an increasing number of "halvies" being born... ------ jpatokal Finland, another country that's traditionally been very isolationist -- before the EU came along, it was second only to Albania in having the fewest % of resident foreigners -- had its own version of this in 1996, when a Finn of half-Nigerian descent was chosen as Miss Finland: [https://en.wikipedia.org/wiki/Lola_Wallinkoski](https://en.wikipedia.org/wiki/Lola_Wallinkoski) ~~~ carlob Italy has never been very isolationist, but there has been some debate when Denny Méndez [1] won Miss Italia. [1] [https://en.wikipedia.org/wiki/Denny_Méndez](https://en.wikipedia.org/wiki/Denny_Méndez) ------ CurtHagenlocher How is this different from when Miss America 2014 was crowned? ([http://www.thewire.com/entertainment/2013/09/first-indian- am...](http://www.thewire.com/entertainment/2013/09/first-indian-american- miss-america-has-racists-very-very-confused/69439/)) ------ vtlynch This entire article is about stereotyping, and the third paragraph is rife with the authors own stereotypes. "My confusion lasts only until Ariana opens her mouth. Suddenly everything about her shouts out that she is Japanese, from the soft lilting tone of her voice, to her delicate hand gestures and demure expression." What is this trash. ~~~ tghw Stereotypes are shortcuts for us to more easily categorize people. The term "stereotype" has gotten a negative connotation associated with it because often stereotypes are used in a discriminatory manner, but the truth is that we all stereotype to some degree to ease our mental load. In this case, the author is identifying behaviors that are more common among people who were raised in Japan than behaviors you might expect from someone raised in the US. My reading was that, while she is ethnically mixed (to whatever extent that phrase is meaningful), she is culturally Japanese through and through. ------ Nadya This is only just now making BBC news? 
This took place ages ago. It's not uncommon for vocal minorities (i.e. nationalists who take some pride in being "pure" to their race, not necessarily in a "nazi like" way) to speak up when a "half" wins a contest that is supposed to represent them as a people. If you read through the comments on this thread you'll find several other examples from different nations about this very same problem. How can you say a 50% Japanese person represents a 100% Japanese person? They are not representative of a 'Japanese person' if they are not fully Japanese! That is the logic of the nationalists. Does it sound racist? Yes. Is it racist? To most people, probably... especially to PC-minded liberals. Do I consider it racist? Not really, but it brings a host of its own problems regarding these sorts of contests. How would a ハフ compete in beauty contests? Would there be an "African-Japanese" classification? What about people who are a mix of 3 races? 4 races? Do we have to do genetic fingerprinting? Would it be based on which race makes up the majority of you? Who cares if she was "selected by Japanese people"? A democratic vote does not guarantee 100% of the people agree with the decision. Let's say she won with a 95% approval rating. That means 5% of the people who voted are either unhappy or neutral that she won. If that 5% is vocal about their disapproval... that's to be expected. Her being selected by Japanese people for the contest does not mean there won't be a vocal minority opposed to it. ------ beachstartup _> There is no word like hafu outside Japan,_ uh, what? how about "half", the exact word that 'hafu' is supposed to emulate? you can't borrow a word from another language, and then claim it's unique to yours! i'm fairly confident nearly everyone on earth knows what it means to be 'half', or just mixed race/ethnicity in general. in fact there are even places on earth where mixed race is the norm. try going to brazil. FUN FACT: the largest japanese population outside of japan is there. i think they might have a concept of hafu there. _just maybe_. if you're paying attention, i think the above statement tells us more about japanese culture than the entire article, on many levels. ~~~ edmccard >where mixed race is the norm ... brazil ...the largest japanese population outside of japan is there. i think they might have a concept of hafu there Yes, but I'd bet that the connotations of "hafu" in a society where mixed race is the norm are very different from those where it is unusual. So much so that you might consider it to be a different word, even if it happens to be spelled the same. ~~~ beachstartup being half japanese in brazil is rare, and the brazilians have a word for it. [http://en.wikipedia.org/wiki/Japanese_Brazilian](http://en.wikipedia.org/wiki/Japanese_Brazilian) ------ mhogomchungu "Who exactly is a Tanzanian?" was a question that was also asked when a Tanzanian of Indian descent[1] represented Tanzania in international beauty pageants. [1] [http://en.wikipedia.org/wiki/Richa_Adhia](http://en.wikipedia.org/wiki/Richa_Adhia) ------ addicted well, at least the Japanese don't call their country's beauty pageant winners terrorists. [http://en.wikipedia.org/wiki/Nina_Davuluri#Response_and_sign...](http://en.wikipedia.org/wiki/Nina_Davuluri#Response_and_significance) ------ gii2 Jhene Aiko is my second favorite blackanese now... :) ------ billpg Beauty contests are still a thing? ~~~ Paul_S Same way football matches are still a thing.
Just because it doesn't entertain you, doesn't mean it doesn't entertain millions of people who don't share your interests. ~~~ billpg Football matches are still a thing? (I'm sorry.)
{ "pile_set_name": "HackerNews" }
Electronic contact lens displays pixels on the eyes - illdave http://www.newscientist.com/blogs/onepercent/2011/11/electronic-contact-lens-displa.html ====== tallanvor It will be interesting to see if they are able to progressively scale up the resolution while still ensuring that you can actually see out of the lens. As cool as this is, though, I'm more interested in having them figure out more ways of restoring vision rather than augmenting it. For purely selfish reasons, I'm especially interested in retina replacement, which is being worked on (they've been able to create new retinas using embryonic stem cells), but I haven't heard of any studies having started yet to actually swap out damaged retinas with the new ones. ~~~ palish How were your retinas damaged, if I may ask? ~~~ tallanvor Luckily it was only a single retina that was damaged, and it was due to a parasite when I was under a year old, so I've never actually known what it was like to have two good eyes. You can get by with only one eye, but you don't have any real depth perception and you don't get any benefits from 3D movies (although you still get all the downsides). Plus, although I would love to try laser surgery to improve my vision in my remaining "good" eye (if you can call needing -9.5 power lenses good), the risks are still too high given the consequences if something went wrong. ~~~ jessriedel Do doctors think you're brain did enough development when you were an infant that it would handle visual info coming from your bad eye if your retina were replaced? I was under the impression that someone like you (who grew up without using the eye) might not be able to usefully interpret the signals from the eye if it were repaired. ~~~ felipemnoa The brain can adapt to "seeing" images through the tongue [1] so I don't think this would be a problem. The brain is always adapting to new inputs. [1] <http://discovermagazine.com/2003/jun/feattongue> ~~~ jessriedel There are limits to its adaptability, especially for processing-heavy tasks like vision for which the brain has specialized hardware. There is a world of difference between interpreting 144 electrodes to pick out just a couple of _bits_ of information, and seeing. In particular, if the brain were infinitely plastic then you could just wire the output from a HD video camera to some random spot on the brain and wait for the brain to figure it out. ~~~ mietek Have you tried doing that? ------ DanBC I'm still waiting for easily affordable "Virtual Retinal Displays" (using a laser focused on the retina) to be available: (<http://www.hitl.washington.edu/research/vrd/>) ~~~ pavel_lishin I'm waiting for the inevitable news stories that will follow - "Could hackers hijack your VRD to fry your eyes with lasers? Tune in at 11." ------ jasonabelli I want one with facial recognition software. Never have that nervous moment again when you have to do introductions and you are just not quite so sure about your old acquaintances name. ~~~ patrickod That but with nice metadata associated with them. Imagine Rapportive in real life. That would be incredibly useful. ~~~ mike-cardwell I look forward to the day when my enemies leave an unattended Facebook session open, and I can update their status so they have a big "I'm carrying five grand in my wallet right now" sign floating above their heads as they walk around dark alleyways. 
~~~ DilipJ I know that was in jest (if not, you may be turning into a supervillain), but the actual implications of having facial recognition built into contacts are immense. It will be a Minority Report type world, where anonymity would be completely gone. I guess it would mean no criminal outlaws or parole violators walking around, but for a common man to not be able to walk anywhere without everyone around him knowing everything there is to know about him....that's scary ~~~ Symmetry It is, but then again that's what most humans have lived their entire lives doing, living in small communities where everyone knew everyone else since childhood. ~~~ DilipJ isn't that why most people leave small towns, to go to a big city and be anonymous. To be able to get away from their past? ------ dfischer Show me the best ratio to winning when playing Blackjack. WIN! ~~~ delinka Geez - this will spawn a whole new area of detection in casino security systems. High resolution cameras capable of seeing the subtle reflective changes in the eyeball, EM detectors at the doors for finding the minuscule circuits on your eyes ... I can also see the casinos' utopia coming to fruition: those without the smarts to count cards, without the expertise to play properly, without all their electronics gadgets, coming to play^W empty their wallets into buckets owned by the casinos. ~~~ pavel_lishin How is that significantly different from the casinos' status quo? ------ inty Interestingly enough, I think a technology like this has to be monitored by professional sports--maybe not now, but I could see this type of technology being the new frontier in "cheating". It seems like the logical next step in performance enhancement. Imagine a display that could read the velocity of a fastball, measure the drop of a sinker, read a quarterback's heart rate, or judge which receiver is the most open. A lot of players in professional sports already deal with technologically advanced contacts, though these are quite rudimentary and merely change the tint (Yellow and Red) to allow for athletes to see specific details much more clearly. ~~~ JeffL If everyone has access to these, then maybe it just makes the sports more exciting and enjoyable for all the players and fans to have extra information? ------ exDM69 Did anyone else think about the Eyephone episode of Futurama when they saw this? ~~~ rasur I thought of Steve Mann's "eye tap" ~~~ polyfractal Was just about to post the same thing. Pretty nifty device, I've considered building my own for a while now. It's pretty simple. <http://en.wikipedia.org/wiki/EyeTap> ~~~ rasur Yeah, I like the wearable computing idea much more than I like walking about with a "smart" phone. ------ tribeofone As interesting as this is, I don't think embedding all this into a contact lens is the way to go. This is purely hypothetical, but how about something like this <http://www.youtube.com/watch?v=mUdDhWfpqxg> (SixthSense) done with a sort of polarity/light effect that could only be seen through a special contact lens. ------ Egregore It's an interesting development, but unfortunately not tested on humans yet. So we don't know yet the real resolution for the human eye. ------ delinka How do you either keep them turned "upright" or detect rotation for updating the images? Contacts slide around in the eye a bit and their shape keeps them on the eye's lens pretty well, but there's no asymmetrical shape to keep the contact lens from rotating.
~~~ cstuder Some lenses, I think mostly the soft ones, are asymmetrical: They are intentionally heavier on the bottom and let gravity rotate them into place. ~~~ gmac Yes -- those are toric lenses, for correcting astigmatism. Unfortunately it makes them rather thicker and less comfortable than the usual ones. ~~~ mietek Funnily enough, I just picked up today my first ever daily disposable lenses for correcting astigmatism. These are the most comfortable lenses I've ever worn, including regular daily disposables. ------ meatsock thread title incorrect, should be 'displays pixel' ------ billybob Wirelessly powering something that sits in my eye? I don't think so. Isn't it about 10,000 times easier and safer to do a HUD using glasses than contacts? ~~~ Cushman At these scales? Probably not. ~~~ dholowiski From the article: "The test lens was powered remotely using a 5-millimetre-long antenna printed on the lens to receive gigahertz-range radio-frequency energy from a transmitter placed ten centimetres from the rabbit's eye. " I'm a pretty rational person and I have no problem holding a cell phone up to my head, but this scares me. ~~~ wcoenen Why? It would probably be a much weaker signal than that of a cellphone (~500 milliwatts). Assuming the tech is similar to short-range RFID, it could run on a transmission power of a few milliwatts. ------ leviathan That sound you hear is the sound thousands of advertisers creaming their pants. Now you will have to look at that ad, even if you close your eyes. ~~~ billybob Not if the software is open source. On the contrary, you could run Adblock and never see another billboard. ~~~ sixtofour Why would you think this could ever be open source. It will be expensive to develop, and no corporation will spend the money to do so and then open source the software. It may (not likely) be jail breakable, but you still have to pay (probably) thousands to get it. I was thinking one way for people to afford them would be to accept ads. I would hate that, and I think I'd just wear glasses or normal contacts. In fact I'd be much more attracted to this sort of thing in glasses, because they'd likely be cheaper, and glasses can probably support more functions. And you wouldn't have a radio receiver concentrating radio energy directly on your eyes. ~~~ jerf Hardware isn't software. I run open source on my hardware, but my hardware isn't open source. If your logic held, open source wouldn't be possible. ~~~ sixtofour Sometimes open source on locked down and jailbroken hardware is better. Sometimes it bricks the hardware. I wonder what it's like to have a brick in the eye? Also this is a thing that goes in your body, and the FDA might have something to say about open source in that context. Is there any open source software running on pacemakers at the moment? I think there are many more issues involved than simply "it's my hardware and I'll do what I want with it." If we get better eyeware through open source then great, but I'm skeptical at the moment that it can, much less will happen. ~~~ jerf "I think there are many more issues involved than simply "it's my hardware and I'll do what I want with it."" Actually, that's kind of my point more than yours. It isn't as simple as a knee-jerk "it must be closed source". You argued black and I argued not-black; not-black isn't "white", it's white and the greys, too. ------ sireat First thing that came to my mind, was Neuromancer where the implanted mirrorshades in Molly's eyes showed time. 
------ signa11 this: <http://www.wired.com/wired/archive/10.09/vision_pr.html> is just too good to not mention here ------ JoeAltmaier Automatic photogrey eyes! Telescopic vision! Zoom!
{ "pile_set_name": "HackerNews" }
Time modulation-based propagation of sound waves on topological metamaterials - bookofjoe https://phys.org/news/2020-07-scientists-major-breakthrough.html ====== peter_d_sherman >"The researchers designed a device made of an array of circular piezoelectric resonators arranged in repeating hexagons, like a honeycomb lattice, and bonded to a thin disk of polylactic acid. They then connected this to external circuits, which provide a time-modulated signal that breaks time-reversal symmetry." [...] >"In a breakthrough for physics and engineering, researchers from the Photonics Initiative at the Advanced Science Research Center at The Graduate Center, CUNY (CUNY ASRC) and from Georgia Tech have presented the first demonstration of topological order based on time modulations. This advancement allows the researchers to _propagate sound waves along the boundaries of topological metamaterials without the risk of waves traveling backwards_ or being thwarted by material defects." My interpretation of this is that these researchers may have invented the analogue of an electrical diode -- but for sound; that is, a _sound diode_. This _sound diode_, then, may pave the way to a greater understanding of diodes (both electrical and acoustic) in general, and possibly pave the way towards a future _sound transistor_, that is, a transistor where a little bit of sound (as pitch, the analogue of electrical voltage) acts as either a switch or a regulator for a whole spectrum of sound (the analogue of electrical current). Of course, all of that is highly speculative, and I'll settle for a _sound diode_ -- for now! I applaud the work of these researchers!
{ "pile_set_name": "HackerNews" }
Big fish are found deep not because of age, climate, or prey, but because of us - curtis https://arstechnica.com/science/2018/06/ecological-law-turns-out-to-just-be-the-result-of-us-fishing/ ====== southern_cross I read a story once where ongoing changes in the timing of salmon runs were (of course) being blamed on climate change by scientists. But then someone who is actually in the fishing industry pointed out that for ages now we've been artificially selecting out those salmon which run during the "standard window", leaving in their wake mostly survivors who show up a little bit sooner or a little bit later, outside of that window. This person was of course then treated with absolute disdain by the online community which was discussing the issue; he might have even been banned outright for saying that. ~~~ peterashford With respect, people have been coming up with their "common sense" objections to what those high-falutin' scientists have been saying every time one of them says absolutely anything at all. It's "common sense" that Climate Change isn't occurring - or man made. It's "common sense" that we can't effect the health of the oceans - they're just so big. It's also "common sense" that the earth is flat - just look around! It's worthwhile raising and examining alternative hypothesis to explain phenomena but rejecting science out of hand on the basis of folk-knowledge is getting pretty old. Rejecting the science needs a higher bar than anecdotes. ~~~ wtvanhest I'm an environmentalist who believes that we should stop carbon polution and that we should dramatically reduce other forms of pollution as quickly as possible. But... I have yet to read a research report that is convincing that climate change is abnormal due to humans. The report most often cited is a meta report, but I'd rather have something with hard numbers that I can show people. Meta analysis is fine, but not convincing. I'm curious what convinced people here that climate change is likely caused by humans rather than just earths cyclical temp changes that have happened for a long time. I say this as someone who wants to understand it. ~~~ dpark Why do you believe we should stop carbon pollution if you don’t think it’s relevant to global warming? If it’s unrelated to climate change then carbon dioxide release is pretty benign. It actually helps trees and other plants. > _I 'm curious what convinced people here that climate change is likely > caused by humans rather than just earths cyclical temp changes that have > happened for a long time._ What part do you have trouble believing? We’ve raised the carbon dioxide level by something like 40% since the industrial revolution. Unless carbon dioxide levels are somehow divorced from the amount of heat the atmosphere captures, we have clearly caused at least a portion of the increase. Scroll down to the long term historical chart to see what we’ve done relative to historical concentrations. [https://climate.nasa.gov/vital-signs/carbon- dioxide/](https://climate.nasa.gov/vital-signs/carbon-dioxide/) ~~~ lopmotr That's the kind of argument that sounds OK only if you already accept the conclusion. Here's another one - The amount of radioactive material in the atmosphere is higher since nuclear testing than ever before in human history. Radioactive material causes cancer so we've clearly caused at least a portion of higher rates of cancer. Call to action - We must all do our part to help clean the radioactive material from the air so we don't die of cancer! 
~~~ dpark > _Radioactive material causes cancer so we 've clearly caused at least a > portion of higher rates of cancer._ I don't think anyone argues that radioactive testing has significantly (or even measurably) increased the background radiation level on Earth. Contrast with carbon dioxide, where we have increased by ~40%. If we had evidence that nuclear testing _had_ significantly increased background radiation levels, then the assertions that it's increased cancer rates and that we should try to clean it up would both be true. ~~~ smueller1234 Err, wait a minute. Atmospheric nuclear weapons tests HAVE measurably increased background radiation. Source: See for example both the table and the C14 graph at [https://en.m.wikipedia.org/wiki/Background_radiation](https://en.m.wikipedia.org/wiki/Background_radiation) Source 2: Grad degree in physics. Worked at nuclear physics lab. ~~~ dpark Well then. I was not aware that we had done that much with our nuclear testing. Obviously then we did probably cause an increase in cancer. On the bright side it looks like we’re almost back to baseline. ~~~ southern_cross Be careful about making too much cancer attribution here, though. For one, naturally occurring background radiation levels can be (and often are) far higher than most people realize, and our bodies have long ago adapted to that as best they can. Two, in addition to other pollutants, burning coal also releases quite a bit of radiation, far more than you average nuclear plant or whatever. Third, for the longest time a dose of radiation received all at once vs. that same dose of radiation received over time were treated as if they were exactly the same biologically. But now that we have actual long-term results to go by, it turns out that radiation received over time is generally somewhere between 10x and 100x less dangerous biologically than if that same radiation were received all at once. Also, it matters a lot what the type of radiation received is and where you receive it. For example, radiation received by the skin (dead tissue) may be all but inconsequential, while that same radiation received in the lungs or gut or bloodstream (live tissue) may potentially be disastrous. ~~~ smueller1234 Those are all good points and for anyone reading along with curiosity, I'd actually recommend the Wikipedia link two posts up as a nice starting point for learning more. Wikipedia on this general subject area isn't too shabby, I thought when I last went down the rabbit hole. ------ jboggan This isn't surprising to anyone who dives or spearfishes. You start to notice the body language of the larger fishes as being completely different than their younger counterparts. Only the wary get to be that old. I would also say that applies not just to human predation but also to predation from other fishes and mammals, you can see their movements and behaviors change immensely when something larger and hungrier is about. ~~~ bonesss There are brave cod, and there are old cod, but there are no brave, old, cod... ------ mlboss I currently reading "Sapiens: A brief history of Humankind". Through out human history we have destroyed ecosystems. We are responsible for extinction of 1000s of species. Oceans was left out because we didn't have the technology, but not for long. ~~~ corporateslaver That’s what happens when you have this many people. For the kind of population growth, it’s either we stop having so many people, or the animals die. It’s us or them. Why is this so hard to understand? 
No amount of talking points and musing on environmentalism will take away those facts of human existence. How could the industrial boom in China or the USA in the late 1800s and early 1900s have happened without emvironmental destruction? How can growth happen in China now without it? Get real about human development and the necessities of human development. ~~~ windows_tips >How could the industrial boom in China or the USA in the late 1800s and early 1900s have happened without emvironmental destruction? How can growth happen in China now without it? Possibly if they had used solar-thermal power generation and focused on battery tech instead of exploiting petroleum so heavily. ~~~ corporateslaver Sarcasm? ~~~ windows_tips There was already heavy investment in steam technology. Using the sun the boil water instead of coal or wood is much less resource intensive and completely eliminates external supply chains. ~~~ codeisawesome Did they have enough tech by that point to even begin to figure out mass manufacturing of solar cells to focus enough energy into water to make steam? Burning something that burns well is a much more intuitive and scalable process... Also, mining the easy stocks (at that point) is a low tech affair too. Which was powered by... combustion of fuel and human labor. We still can’t make solar panels with 100% efficiency, what hope did they have? ~~~ windows_tips You don't need solar cells to heat water to boiling, just glass or a metal like aluminum or silver. With glass you can build a lens to concentrate light. With a metal you can build a concentrating mirror. ~~~ codeisawesome Good points. But lens building is a complex endeavour that requires tooling at the manufacturing level (especially computing power!) - no matter the choice of available material, yes? Doing it with high precision at a civilisation- powering scale seems like an impossible ask of a species that was still considering much of their Home planet to be “new” (Americas). ~~~ windows_tips Galileo was making lenses hundreds of years before the time we are talking about. Also, eye and magnifying glasses existed then. ------ pvaldes I'm sceptic about that, specially after reading this. > All the species in which older, bigger fish are found in deeper water have > something else in common: we eat them Hemm, not. Poisonous fishes show the same pattern. There are poisonous fishes also in the deep sea. If we do not eat then, why do not live in the surface? Mola mola is the biggest extant bony fish. Is neither fished nor eaten normally. Adults still pass most of its time at deep waters. To extrapolate the behaviour of 30.000 species of animals after one single species and one single variable leads often to wrong conclusions. Fishes are complex creatures. ~~~ biofunsf Are you just saying that there are poisonous deep sea fish? To be a counterexample it needs to be a poisonous fish that exists in shallow waters but whose older members only occupy deep waters. Entirely deep-sea species aren't what this article is referring to. I'm not aware of any poisonous fish that aren't hunted by humans that follow this pattern. Just saw your edit. I don't think that sunfish fish of different ages spend their time at different ocean depths? Wasn't mentioned on wikipedia. ~~~ pvaldes Many deep-sea fishes have buoyant eggs and larvae that can reach the photic zone. Even the species that do not reach the photic zone live in shallower areas when younger. 
A lot of pelagic fishes start their life at 1cm under the ocean surface also. ------ olliej Well yes, there a fairly simple bit of selective pressure:fish nearer to the surface are more likely to be caught. As industrial fishing is aggressive to mass collection that increases pressure to move deeper. It seems reasonable then that fish that are generally remaining (in average) deeper than there counterparts will (again on average) live longer, and so get bigger. This is basic evolution. It’s no different than commercial fishing farms having to deal with their fish growing more slowly or simply staying small (the selection criteria being “stay less than X cm”) ------ vfc1 It's surprising that fishes adopt that behavior, but then again industrial fishing activity now covers at least 55 percent of the world's oceans - [https://www.sciencedaily.com/releases/2018/02/180222162124.h...](https://www.sciencedaily.com/releases/2018/02/180222162124.htm) ------ dalbasal Tangental: " _the researchers added a simulation in which the depth and mass of fish were tied to the rate of mortality by fishing_ " Is this really a process where simulation->result->theory. IE died the model or simulation played a key role indeveloping or validating a theory. Or, is this a process where they started from a theory (that works, explaining observed results) and then built a simulation that demonstrates the theory? I feel like I'm seeing this kind of language in the pop science press, maybe starting from the climate/weather world. Have models of ecosystems (eg) been getting more use as "theory generators," are researchers creating more models ex post as an supporting evidence, or is this just a shift in journalistic language? ...and... if the answer is "models are really playing more of a role"... please elaborate if you know anything interesting. What software is being used? Are biologists building these models/simulations themselves, or are there specialists available to them? ^ I'm not hating on language choices or ex post evidence gathering. Darwin used what we might call a "simulation" (or statistical analysis, depending on the journalist) to predict and then validate a theory about bird mortality in storms. That was great science, and foundational. Not hating. Just curious about changes in the role of simulation/modeling. ~~~ marcus_holmes yeah, every time I see "we proved this theory using a simulation..." I get a little voice in my head that says "then you didn't prove anything except your ability to create a simulation that agrees with your pet theory" I understand that conducting experiments with ecosystems to prove why they behave in particular ways might be a tad tricky. But that just means that theories about ecosystems are hard to prove, not that they get a free ride to "proven". ~~~ dalbasal A more charitable way if putting it might be, is the simulation part of the theory or part of the evidence for it. Both are acceptable, but not at once. Simulation as theory is "safer" in a poperian sense. ~~~ marcus_holmes good point. The way the article presented it, the simulation clearly appears to be part of the proof. But that could be a product of the journalistic filter applied. ------ yitchelle This reminds of the "empty box problem solved by a $20 fan" story. Sometimes, the most obvious reason is probably the actual reason. 
[0] - [http://cs.txstate.edu/~br02/cs1428/ShortStoryForEngineers.ht...](http://cs.txstate.edu/~br02/cs1428/ShortStoryForEngineers.htm) ------ AdamM12 Wasn't there a post earlier that more mammals are becoming nocturnal because of humans? ~~~ tylersmith I didn't see that and can't find it so if anybody has a link that'd be great. I'm no biologist but have always been taught that this is the case. Specifically, while growing up on ranch I was told that coyotes near the house hunted at night while the coyotes that lived further out in the fields hunted during the day. ~~~ AdamM12 I guess I could of not been so lazy and used the search for "nocturnal" and found it. God forbid it be the first result. [https://news.ycombinator.com/item?id=17329264](https://news.ycombinator.com/item?id=17329264) ------ smaddox Reminds me of this NPR article from 2014 with pictures showing how small "the biggest catch" is today compared to 60 years ago: [https://www.npr.org/sections/krulwich/2014/02/05/257046530/b...](https://www.npr.org/sections/krulwich/2014/02/05/257046530/big- fish-stories-getting-littler) ~~~ SteveGerencser Part of this is because of the changes in catch laws. For example. Many of the fish in the oldest photos are Goliath Grouper. Nearly 30 years ago harvesting Goliath Grouper was banned. So it makes sense that you would see far fewer pictures of a fish you are no longer allowed to catch and keep. ~~~ codeisawesome 1965 (mentioned in the article as being the start of a period in which pictures show smaller fish) is 50+ years ago however ~~~ taeric I'm not sure that disagrees with the point. There was a species of fish that was particularly large that was caught. It was identified that we were damaging that species, so we banned catching it. It makes sense that part of the evidence for banning, was that we were finding less of them, especially at size. That is, the point is that this metric was largely driven by that fish species. ------ cerealbad climate change has become an apocalyptic orthodox religion for secular pessimists, just like technology is an idealized utopia for materialistic optimists. this tends to happen when you build a world view on a single pillar and it starts sinking and tipping over, prophecies and pronouncements of certain doom or visions of heaven naturally follow. the real issue is that modern education is not cross disciplinary so you have highly specialized people creeping into fields they don't understand and making basic errors of judgement based on pre-conceived ideas of what ought to be. you see this for example in the interplay between physics, mathematics, computer science and engineering. the physicist wants statistics over big data sets, the mathematician wants abstract tools and automatic proof theorems, the engineer wants practical numerical solutions and modelling and the computer scientist is fixated on throughput and efficiency of generative algorithms. is this even true? has anyone stopped and said, what are we actually doing? is there a bigger plan here, are we trying to work towards a common goal, or has the ladder we climbed been pulled away? the relativistic revolution and subsequent collapse of the western world, the church, the state and the individual all dissolving into a jumbled mess of professional agitators applying some bayesian strategy of influence maximization is an unfortunate consequence of a change in world order, from west to east. 
being on the wrong side of the cyber curtain is a slow bleak dawn, with the intellectual and political elite always the last to realize the surrounding darkness. globalization is an example of a desperate last gasp of a former dominant culture trying to exert control and influence over a world increasingly more alien and separate from it. the irony of course is in the conceit of the name itself. project the growth of asia and africa forwards for another half- century, then work backwards and try to understand the hysteria behind fossil fuel usage and the impact this will have on the balance of world power if 3+ billion new people become electrified and industrialized. ~~~ marcus_holmes I'm in SE Asia at the moment, and the view is different here. Globalisation doesn't look like the last gasp of a former dominant culture here, it looks like the way out of crushing poverty. People WANT to work in globalcorp garment factories. They get really annoyed if the factories shut down. For all our western idealism about a pastoral lifestyle, everyone who has the choice between growing food for a living and working 90 hours a week in a factory chooses the factory every time. And yes, it is their choice. They can leave whenever they want, there are 4375979 other applicants waiting at the gates. The interesting thing is that they're skipping steps in the process. Cambodia skipped desktop/wired computing/telecoms and went straight from nothing to mobile (in about 5 years). Mobile access is hugely cheaper and generally faster than it is in either the UK or Australia. I can see solar power doing the same in the next 5 years. Rather than waiting for the electricity authority to get round to wiring up the rural areas, they're already getting on with rigging up their own solar. In 20 years they'll be self-sufficient and the megaprojects to dam the rivers will be bankrupt. ~~~ cerealbad in your observations, are consumption habits changing the same way with respect to diet? eg. will they (mostly) skip unhealthy processed snack foods and move towards something new, and if that happens how will preventable diseases that plague western countries manifest if at all, the health and drug industries, aged care (family vs business). some things are skipped others might be accelerated. globalization works as long as there is another country down the road you can kick it to, but the axis of power will eventually shift towards emergent markets and those governments will cut out western monopolies out of self-interest, or at least be very resistant to a foreign controlled corporation taking its local profits offshore. japan and south korea have a resistant corp-gov structure due to monopoly, high technology and mass employment creating huge economic leverage over the political system. are chaebols nepotistic fascism, a valid competing business structure, benevolent dictatorships? were not having this conversation in the 1980s and most of the worlds airports and shopping malls look eerily similar today, yet older systems are more brittle and prone to fracturing and new infrastructures tend to be cheaper, deploy faster and require less maintenance. giant malls in america remind me of communist apartment blocks dotted all over eastern european capitals, permanent monuments to a big idea that proved temporary. at least the soviet model serves some present social utility. the hidden cost of solar is of course in the manufacturing and re-use process which is fossil fuel dependent, but se asia has oil and gas fit to purpose. 
globalization for the western leaders means controlling the development of poorer countries by creating avenues for the corporations they front to come in and build, for the poorer countries i think they correctly see it as a way to accelerate society but not necessarily at the cost of autonomy like some corporate colony of an american or german or chinese conglomerate. people may reduce consumption individually, but over the scale of the planet the food industry alone must be undergoing some giant mutation. it's really hard for me to imagine a future in asia and africa without massive increase in petroleum and it's assorted bi-products. ------ aaronmu Do older and bigger fish evolve to live deeper or do fish that live deeper get older and bigger? Humans think that we can tell cause from effect but this is rarely the case. ------ codeisawesome I love this, it’s a fun story, and especially funny to read all the long- winded science-y sounding explanations that humans came up with rather than consider the self deprecating simple fact that we’re just eating them all. BUT I am definitely not happy that this is going to be one of those weaponised stories that the anti-science brigade will start telling and re-telling everywhere, lending credibility to other lies they believe in. ------ ako The fact that fish are getting shorter is illustrated nicely in the article: [https://www.npr.org/sections/krulwich/2014/02/05/257046530/b...](https://www.npr.org/sections/krulwich/2014/02/05/257046530/big- fish-stories-getting-littler?t=1529558188716) Pretty disturbing. ------ Theodores Once upon a time rivers were clear and full of fish. There was no need for anyone to get in a boat and head out to sea to catch fish. Essentially the fish 'came to us' (not that they saw it that way). With the advent of farming the rivers became full of the soil washed off the land and were no longer clear. Fishing now needed a boat. At first this fishing was easy, however we ate the big fish and made various fisheries a thing of the past, case in point being the Newfoundland Grand Banks, once famous for cod but no longer. So what do people eat now instead of 'cod and chips'? This amuses me as there is no problem for them. They get some other fish shaped object - 'battered' \- and make judgements of how tasty or otherwise it is. Maybe this fish comes from another hemisphere, perhaps it has some unusual name, maybe it tastes 'different'. Marketing conquers all though, this mystery fish caught somewhere off Antarctica will have a fancy name as if it is some deluxe, exotic variety. It is as if the 'chef' went out of his way to offer something more exciting than 'humdrum cod' specifically for them, charging them more for this 'nouvelle cousine' in the process. No discussion of what happened to the cod will be mentioned although the despicable newspaper that will be read with the 'faux cod and chips' will talk about the evils of the E.U. and the Spanish fishermen invading 'our' shores. Soon these fools will be eating the scummiest bottom-feeding detritus that can be scooped from the ocean's depths. They will do it if not every day but as the mood takes them until the day they die. Discussion will be made on how well the faux-cod is cooked and how lovely it was. The thinking won't go much further than their bellies and bowels. 
Meanwhile, driving in their cars and going on their cheap holidays, these consumers, whom the world's economy depends on, will be pumping untold tonnes of carbon dioxide into the oceans, acidifying them to make life in the seas quite intolerable for our friends with gills. The cruise boats they holiday on and the container ships that bring them their cheap junk from China will also make so much noise in the oceans that any fish with ears will not be able to hear anything but the sound of diesel internal combustion engines. There is no penalty for complicity in this madness. However, the cost of stepping outside of it is quite high. If you don't eat your fish or your beef or your mechanically removed horsemeat then there is not a place at the table for you. Unless you have got a good excuse. But nobody says they are 'allergic to beef' or 'allergic to fish'. You can only be allergic to peanuts. Much like how folks in the mafia have to have slaughtered someone to get a space at the table, so it is with meat and fish, to be accepted by your peers you have to partake in the ecocide. There are those that are vegetarian and wanting to live life differently. But these people are somewhat drowned out by the vegan crowd, many of whom have entirely different motivations. This new vegan crowd are more likely to have an eating disorder that has nothing to do with saving the planet. Plus they can't get five minutes into a conversation without wanting to tell all that they are vegan. Their fanaticism plus the cost of their fake-meat products, e.g. that organic almond milk flown in from California, is no example for others to follow. We will see if the meek inherit the earth, maybe the fatties scoffing all the fish are going to be going extinct shortly after they have scoffed the last plastic-infested jellyfish they can find in the oceans. ~~~ bbarn What you've just described is less an argument for vegetarianism and more an argument for population control. If the fish come to you, and you're incapable of overfishing them, your whole fantasy stops. That said, you're right in that vegetarianism is just not sustainable in this size a population - but not just for elitist vegan snobs. If we all ate mostly vegetables, and got our proteins from other sources, we'd be facing a scarcity of land to grow crops as well as sources of fertilizer. The math just doesn't work at scale, and that's why purely vegetarian nations are generally that way: because they are poor, not enlightened. They also aren't usually particularly healthy. Ironically the healthiest cultures (if you'll permit the generalization) are those eating mostly fish-based diets like the Pacific and Mediterranean. ~~~ greglindahl Vegetarianism doesn't scale? Right now most Americans eat meat that mostly eats food like corn and soybeans. How would it take more land if people got their protein from tofu instead? ~~~ bonesss Also the meat we eat eats food. That's how it grows. There's an obvious caloric inefficiency there, if we're struggling. Personally I think vat-grown meat is gonna be the ideal win/win: low environmental footprint, perfect high quality kobe beef for all :) ~~~ hkyeti Why wait? Excellent plant-based foods are available now, and the improvements in quality, taste and cost are accelerating. ------ the-pigeon Do we not count as prey? ~~~ rjpr I think we're predators, not prey. ~~~ rarec We are most certainly predators. There's no animal on this planet a human couldn't beat with some combination of tool or tools. 
~~~ bhhaskin We are the ultimate apex predator on this planet. ------ titzer Under essentially any accounting system, when it comes to the environment, we're a bunch of fucking assholes. ~~~ noir_lord We really are but I'm not sure what to do about it (other than considering environmental policy when voting). We don't drive (me or my partner), We recycle, I mend everything I can (which outside of a clean room is most things, it's never been easier with youtube), neither of us goes in for excessive consumption (My current desktop is from 2011, it's finally coming up on time to replace that). I even considered going vegetarian but with Crohn's that isn't really viable (I have to avoid high fibre _and_ getting enough minerals/vitamins is a struggle). We don't eat wild caught fish (or much fish at all). I mean I'm not a tofu eating ecowarrior or anything but it just seems sensible not to piss in your own drinking water. ~~~ protonimitate I don't mean this to criticize attempts at personal responsibility when it comes to sustainability, but I think we are far past the point of no return. Short of a massive population decrease, a complete switch in energy consumption, or some sci-fi type of technological breakthrough, we are doomed to run this planet into extinction. At this point, most conservation efforts are more of an attempt to feel better as an individual than they are actually making a difference. That doesn't mean we should stop, or concede, with our efforts. It just means we need to be realistic with what we do over the next 50-100 years. Personally, I think the only way the human race will continue to thrive is to figure out how to survive without the Earth as a host. Whether we figure that out, or succumb to extinction, is really the ultimate test of our race. Were we designed to be a part of nature as it exists on Earth, or will we surpass it? ~~~ InclinedPlane This is a defeatist and ahistorical attitude. We've brought species back from near extinction, we've brought entire biomes back from near destruction, we've made the air and water cleaner, we've saved the ozone layer, etc, etc, etc. We know how to do these things, it doesn't require magic or impossibly futuristic technology or excessive personal sacrifice. It's just work. All we have to do is do the work. ~~~ oldcynic We've forced species so near to extinction that they need to be "brought back", we've had such a profound effect on the environment that there is essentially nowhere left that does not show the impact of humankind. We've made the air and water cleaner in highly specific ways whilst continuing to pollute globally and watch the developing world repeat our mistakes on a larger scale. We saved the ozone layer yet have thus far done nothing substantive as far as carbon is concerned. We've diligently avoided any serious discussion of population pressure on the planet. We know how to do these things, we know how to avoid the mistakes of the past. Yet we are not doing them. How do you propose getting the international community to do that work?
{ "pile_set_name": "HackerNews" }
A year with MongoDB (The Good & the Bad) - anthony_barker http://data.story.lu/2012/04/14/a-year-with-mongodb ====== anthony_barker The followup is in this post - Basically they migrated a portion of their data to Postgresql and Riak. [http://blog.engineering.kiip.me/post/20988881092/a-year- with...](http://blog.engineering.kiip.me/post/20988881092/a-year-with-mongodb)
{ "pile_set_name": "HackerNews" }
Super-simple way of serving assets in Django that’ll actually perform well - flexterra http://kennethreitz.org/introducing-dj-static/ ====== teilo About the only advantage I can see in doing this is not having to specify any static directories in your Apache or Nginx config. What is the use case that makes serving via Static preferable?
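For readers wondering what the linked post's setup actually looks like, the whole trick is wrapping Django's WSGI application so it serves the collected static files itself. A minimal sketch, assuming the Cling wrapper that dj-static's own README documents (the project name and settings module below are placeholders, not anything from this thread):

    # wsgi.py -- hedged sketch of the dj-static wiring
    # Assumes: `pip install dj-static`, STATIC_ROOT/STATIC_URL set in settings.py,
    # and `python manage.py collectstatic` run at deploy time.
    import os

    from django.core.wsgi import get_wsgi_application
    from dj_static import Cling  # serves files under STATIC_URL from STATIC_ROOT

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")  # placeholder

    # Wrap the normal Django WSGI app; requests for static assets are answered
    # directly by the wrapper, everything else falls through to Django.
    application = Cling(get_wsgi_application())

The appeal over the Apache/Nginx alias approach is that one process serves both app and assets, so the configuration travels with the code (handy on Heroku-style hosts); the trade-off teilo hints at is that a dedicated web server or CDN will still outperform it for heavy static traffic.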
{ "pile_set_name": "HackerNews" }
American Railroads Are Already in Recession with No End in Sight - toomuchtodo https://www.bloomberg.com/news/articles/2019-10-07/american-railroads-are-already-in-recession-with-no-end-in-sight ====== toomuchtodo Outline: [https://outline.com/zgawAx](https://outline.com/zgawAx)
{ "pile_set_name": "HackerNews" }
Don’t Say Goodbye Just Ghost - hourislate http://www.slate.com/articles/life/a_fine_whine/2013/07/ghosting_the_irish_goodbye_the_french_leave_stop_saying_goodbye_at_parties.html ====== Multicomp I will say, the "Northern Irish Goodbye" referenced in the article of announcing your intention to ghost seems sensible enough, depending on the circumstances. You can't say that at a baby shower though, where baby shower = any given pseudo-formal event where the crowd has expectations of you.
{ "pile_set_name": "HackerNews" }
Ask YC: Favorite email parsing library? - cduan Any recommendations on a good library for parsing emails in mbox format? I'm pretty agnostic as to language (I don't know Python, but if the library is that good, I will learn it). I mostly care that it takes care of all the language encoding business for me. I've been using the Perl Mail::Box suite, and language encodings with that are just a mess. Thanks everyone! ====== skwaddor leave the parsing to someone else and use plan9's upas mail file system <http://doc.cat-v.org/bell_labs/upas_mail_system/> <http://www.quanstro.net/plan9/nupas.pdf> now available for plan9port on Unix we don't need no stinking APIs and Libs, give us grep awk sed and cat ------ zeantsoi Best parser I've found for PHP is the Pear mimeDecode library. Takes a bit of time to figure out the header parsing but it's pretty decent at handling the UW torture test. ------ grandalf rmail for ruby totally rocks for parsing emails. I haven't used it for the mbox format but I think it can handle it. ------ diN0bot hmm, i've parsed email before. is there something specific to email encoding compared to general files? such as parsing the "encoding" header thingy? anything else? ------ uriel upasfs(4) <http://man.cat-v.org/plan_9/4/upasfs> Edit: Damn, skwiddor beat me to it :)
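For anyone hitting this thread later: the "mbox plus character encodings" problem described in the question is now covered reasonably well by Python's standard library alone. A minimal sketch, assuming a recent Python 3 (the email.policy API shown here post-dates this thread) and a placeholder file name archive.mbox:

    # Iterate an mbox file and print decoded headers and body sizes.
    # policy.default makes the parser return EmailMessage objects whose
    # headers are already RFC 2047-decoded and whose text parts decode
    # their declared charsets for you.
    import mailbox
    from email import policy
    from email.parser import BytesParser

    def make_message(fileobj):
        # mailbox hands the factory a binary file-like object per message
        return BytesParser(policy=policy.default).parse(fileobj)

    box = mailbox.mbox("archive.mbox", factory=make_message)

    for msg in box:
        subject = msg["Subject"] or "(no subject)"   # already decoded to str
        sender = msg["From"] or "(unknown)"
        part = msg.get_body(preferencelist=("plain", "html"))
        text = part.get_content() if part is not None else ""
        print(f"{sender} | {subject} | {len(text)} chars")

This is only a sketch rather than a drop-in answer to the original question, but the policy-based parser is the piece that replaces the manual header decoding and charset juggling that makes Mail::Box feel messy.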
{ "pile_set_name": "HackerNews" }
Smart Kids Should Skip High School - exolymph http://sonyaellenmann.com/2015/09/why-skip-high-school.html ====== mjevans Could I instead have had the curriculum at my own pace? Really it should just be a national, fully open, stack. No expensive text books (there's a nice racket) / etc. Not just high school, but all of it. If we need to have 'learning centers' where parents can warehouse their kids while going to work, lets just fund that with those results in mind, and have additional tutors (maybe other students for extra credit, based off of the realistic improvement rate in the grades of those tutored) for those who aren't advancing quickly enough. Of course, full boarding that the kids can opt in to (for predictable meals and escape from abusive situations) could also help. ------ clessg I dropped out of high school in my junior year. Predictably, most people remarked that it was a poor decision and that my prospects of employment would be forever decimated. I skipped ~50% of days in grades 8-10, but somehow passed. I spent those days coding and learning on my own time. As school got progressively less challenging, I dropped out in grade 11 and decided to stop wasting my time. Finally, I could pursue my passion in software full-time. That additional time and freedom was instrumental in achieving my dreams. It was almost certainly much more valuable than more wasted years of high school. By the age of 19 I was already earning a very significant sum of money and had years of solid experience. But who knows? Maybe I missed out on some important friendships or valuable connections. Perhaps my lack of a high school diploma will be an insurmountable barrier in the future. Although I believe it was the right decision for me (who knows), I don't think it's the right decision for the vast majority of people. You must have discipline and passion. You must be your own boss and have a plan. You should enjoy learning on your own. Treat time as a precious resource. I am fortunate that my chosen field, software development, generally does not require a diploma or degree. Exceptions exist, but I'm not interested in those jobs. Any other field and my decision could have been disastrous. ~~~ TimPrice > You must have discipline and passion. You must be your own boss and have a > plan. You should enjoy learning on your own. Treat time as a precious > resource. Sadly, most of that is not taught at junior/middle school. Getting out of the labor force factory that 'education' is in most countries, may not be the best idea even for the majority of 'smart kids', as you say. ------ visarga I don't think kids should skip high school, but I think parents could let up on expectations placed on children to compensate for the uselessness and tedium of much of the subject matter. On the other hand, engaging in passion driven projects that also involve self directed study is the way to go. Parents should encourage the latter while being less fussy about grades. Grades don't matter anyway 10 years later when the child will be hired or try to make a business, but a few passion driven projects could go a long way towards later success. ~~~ quanticle The problem is college admissions. Sonya says, "Skipping college is almost middle-class mainstream at this point," but she's dead wrong. It might be mainstream among the lower classes, but they aren't doing too well these days, are they? 
And it's mainstream among a certain fraction of the upper-classes who have enough of a safety net in family wealth and connections that it doesn't matter. But for the vast majority of the middle classes, the true middle classes, college is still very much a mainstream choice, and increasingly important as a backstop against falling into the lower class. If you don't have a high school transcript, your chances of getting into a college go down by quite a bit. As for, "I could have spent three years writing and reading and working on interesting projects, instead of enduring the sociocultural hell of high school," I nearly giggled when I read that. Maybe Sonya was among the special 1% of kids who would actually have done that, but realistically, if I had skipped high school, I would have spent my days playing video games and getting into trouble, not "reading and working on interesting projects". ~~~ thaumasiotes Given that I spent my high school years reading anyway, I feel pretty safe in saying that had I skipped high school I would still have spent the time reading. There's nothing about going to high school, or having any other occupation, that precludes you from reading. I would have spent most of the time playing video games, as I in fact did during college, but I wouldn't have done any getting into trouble. This would have been largely unproductive, much like attending high school was. ------ thinkingkong Im going to take a different position. I think school is valuable. But only because it informs you about a curriculum and subjects you might not know exist. People are smart. But people need to figure out what they can be geniuses at. If we remove opportunity in this way, we absolutely must find an alternative to telling people what they dont know they dont know about. ------ twotwotwo I half did this: after sophomore year, I left high school for what's now called Bard College at Simon's Rock, a college specifically for younger kids. tl;dr if you think you might be ready for college you should look at it. There are scholarships. I have no regrets about that. It's as demanding as you want. About 2/3 of kids there get an associates' to transfer to another school after two years; those that stay all four do a B.A. thesis. Some folks who transferred reported being bored at their new schools, but I don't have much of a sample. :) I'd probably have had a slightly shinier-looking résumé if I'd gone on through high school, but there's more to life than that. For me the résumé thing is moot (the Rock actually helped; I got work through a teacher referring me to an alum), and plenty of classmates have done well in tech or law, become doctors, gone into research (including in hard sciences), etc. It produces a good number of high achievers for its small size and high acceptance rate, probably because kids interested in rushing into college are a sort of funny pool already. Occasionally bureaucracy requires some grad to take a GED exam because they left high school, but that's straightforward. For high-school me, the focus on the liberal arts and the beliefs of folks there really contrasted with how I looked at the world at the time. Same would be true for a lot of folks here I figure. The tension from being exposed to something different was productive for me. Sounds icky I bet, but what you need to learn is not always the stuff you come in wanting to learn. Anyhow, the site is [https://simons-rock.edu/](https://simons-rock.edu/). Hope this is useful to folks. 
~~~ qrendel The problem is that's a small and selective school that only accepts a few hundred people. It's hardly a solution for the entire country, much less world. While it lists an 89% acceptance rate, that's out of only 199 applicants and with a total student body size of 329 (in 2015). It's an "elite" school for "elite" kids to basically start college early. Possibly a model for other schools, but it alone doesn't scale to the rest of the population. Full disclosure: I applied when in high school, remember getting the full tour and sample class, etc, but was rejected, imo because I pretty clearly didn't hit it off with the interviewer. It seemed like a very nice place though, if you can get in. ------ waterhouse Very smart kids should skip multiple grades at least. [http://files.eric.ed.gov/fulltext/EJ746290.pdf](http://files.eric.ed.gov/fulltext/EJ746290.pdf) "A 20-year longitudinal study has traced the academic, social, and emotional development of 60 young Australians with IQs of 160 and above... The considerable majority of young people who have been radically accelerated [skipped 3+ grades], or who accelerated by 2 years, report high degrees of life satisfaction, have taken research degrees at leading universities, have professional careers, and report facilitative social and love relationships. Young people of equal abilities who accelerated by only 1 year or who have not been permitted acceleration have tended to enter less academically rigorous college courses, report lower levels of life satisfaction, and in many cases, experience significant difficulties with socialization. Several did not graduate from college or high school." ~~~ mrkgnao I would guess Terry Tao was one of them? ~~~ waterhouse Correct. In case studies of a subset of 15 of these kids, he was one of three with a 200+ IQ. What happened to the other two? "Christopher Otway"'s story is described in the link. "While in Grade 1, Chris was accelerated to work with fifth-grade students for math and sixth-grade students for English. The following year he did math with seventh-grade students ... at the end of his second-grade year Chris made a full grade skip to fourth grade but took math with the eighth grade. By age 12, he was theoretically enrolled in 9th grade but took five subjects (physics, chemistry, English, math, and economics) with 11th-grade students 5 years older than he." Eventually, "He entered university at 16 years 2 months, graduating with Bachelor of Science (First-Class Honours) in computer science and mathematics at age 20. Chris won a scholarship to a major British university and graduated with a Ph.D. in pure math at age 24. Since then, based in London, he works for a worldwide consultancy assisting other companies with financial strategies." As for "Ian Baker": starting with preschool, "only after his parents had gently informed the teacher that he had just finished reading Charlotte's Web was he permitted to forego reading readiness exercises". In grade 1 and 2, his teacher worked with him to give him stimulating material, ranging up to grade 8 math, and he was in a pull-out program with other gifted children; however, in grade 3, a new principal canceled the pull-out program, and Ian's teacher permitted him to work on a grade 7 math textbook "with no guidance or assistance, and no other children to work with", and Ian lost interest. Ian was bored and miserable in grades 3-4. 
He switched schools, and partway through grade 5, was given a high-school math teacher mentor, who took him through math up to grade 9. In grade 6, he took 10th grade math, and the following year skipped into grade 8, taking 11th grade math and CS and 10th grade science. In grade 10, he began taking university math. Ian enrolled in university at 17, did a bachelor's degree in "computing systems engineering", graduated at 20, and was in his fourth year of a Ph.D in digital hardware design at 24. According to the author of the study, "Ian Baker's mathematical ability is certainly on a par with that of Christopher Otway and may well equal that of Adrian Seng [Terence Tao's pseudonym]. Unlike Adrian and Chris, however, his astonishing potential has largely been ignored by the education system ... It is unfortunate that he had to suffer through four years of appalling educational mismanagement." The author also says, "Adrian is the only child of the 15 who believes that he has been permitted to work, at school, at the level of which he is capable." What was Adrian's educational program? At age 3.5, he tried entering preschool, but couldn't cope with a full school day at that age and left. At age 5 (by which time he had done all of elementary school math in home study), he entered school again. His parents worked with the principal to design a flexible program, in which he was able to progress through two grade levels per year. At age 6.5, he was attending grades 3, 4, 6, and 7 in different subjects. At age 7.5 he attended high school for part of the day, doing grade 11 math; the rest of the day was spent in grades 5 and 6 at the elementary school. At 8 he was doing math, physics, English, and social studies at high school (variously at grade levels 8, 11, and 12). At 8.5, "having informally sat and passed university entrance mathematics", he started taking university math, first in independent study and then guided by a professor; at age 8.75, he stopped attending any elementary school classes, and spent 3/4 of the day at high school, doing sciences in grades 10, 11, and 12, and humanities and "general studies" in grade 8, and 1/4 at university. By age 12 "his studies at university included fourth-year algebra, second-year physics, and second-year computer science". "By age 14, he had passed university entrance examinations in mathematics, physics, chemistry, biology, and English, and completed university courses in areas such as mathematical physics, quantum mechanics, discrete mathematics, linear and abstract algebra, Lebesgue integration, electromagnetic theory, optics, and several areas of computing science." (source: "Exceptionally Gifted Children", 2nd edition, by Miraca Gross) It is, shall we say, interesting to imagine what might have been, if all the kids in the study had been afforded such a program. ------ s_m_t I agree. Despite going to one of the best public high schools in the country I don't remember learning much of anything. Looking back on it it was more like a very pleasant teenage internment camp. Parts of it were fun, the drugs and alcohol being a highlight. Going to school high or getting high at school was pretty fun. Sometimes competing with other students on the tests without studying or doing any of the homework was fun. But I imagine I could have had a lot more fun and learned a lot more elsewhere. ~~~ dalke I went to one of the better public high schools in my district, so not one of the best schools in the country. I learned quite a bit. 
Both the US history and European history teachers were very good, the calculus and differential equations courses were an excellent base for college physics and applied math, the computer class marked the start of my transition from an avid hobby programmer to a software developer, and drafting changed the way I understand a building. As it happens, I'm still interested in history, math, science, programming, and building design, so perhaps I remember those best because I found them interesting. You were interested in getting high and competing with your friends, so perhaps that's why you remember that part best? ~~~ calibraxis That last paragraph would sound unpleasant if it replied to me. :) That poster can take solace that they're in good intellectual company: _" In fact, I can remember a lot about elementary school, the work I did, what I studied and so on. I remember virtually nothing about high school. It’s almost an absolute blank in my memory apart from the emotional tone, which was quite negative."_ — Noam Chomsky ([https://chomsky.info/reader01/](https://chomsky.info/reader01/)) ~~~ dalke I can see that interpretation. Let me try another way. I remember nothing about my government or human health class, very little about my sociology class, I try to forget everything about my senior year English class, and I took Spanish only because there was a state scholarship where part of the requirement was to take three years of a foreign language, instead of the state-mandated two years. I surely learned things in these classes. I've likely forgotten them because the contents didn't interest me enough. If I regard the brain as RAM, there was no refresh so the memories faded. Just like I remember very little now about most of my college classes. I took numerical analysis, thermodynamics, theater, psychology, database organization, and more, but I've forgotten them. Even my analysis course, where I remember I adored doing epsilon-delta proofs, has faded into all but emotional tones. I've also forgotten how to do PDEs, but knowing that technique saved my butt a few years later when I took the qualifying exam for physics. While I vividly remember my theory of automata course, my discrete math course, and a few others which are still so key to what I do every day. It seems I remember best the things I like, and the things which I still use. I can totally understand that someone who liked high school because of friends and getting high might only remember that part. That doesn't have to mean there was no learning, nor even that the learning wasn't useful. Only that it's no longer recallable. ------ LyndsySimon In the context of the article, my wife and I are being even more extreme - our kids are skipping _all of it_. We're unschooling. Our oldest is seven, and shows a ton of potential. Instead of trying to sit her down every day and "teach", we literally just gave her a pile of books and told her to have fun. Two or three days a week she tends to sit down while her younger sister is napping for the sole purpose of learning. That's usually math, but it could be anything. She's starting to take on small projects of her own design as well, which is exciting. For example: we were out walking on a popular trail last week, and she emptied her water bottle. She decided it would be a good idea to sell water and snacks on the trail. Without input at all from us, when we got home she started listing out all the things shed need to set up a mobile food cart of sorts, with cold drinks and snacks. 
Then she shared it with me and we worked together to polish the idea a bit - I suggested she use a wagon instead of a stroller, and limit herself to bottles of water and prepackaged trail mix. She will probably see that particular project through to completion, break about even on it, and move on to something else. She did that in the past with a lemonade stand at our apartment complex. As for her progress, her math skills are significantly ahead of her peers. Her reading ability is something I think needs improvement, but we test her twice a year and they show that she's at parity with her public-schooled peers. She's picked up reading very quickly in the past 2-3 months though, and I'm very interested to see how she places this time around. So, yeah, skipping high school doesn't sound all that crazy to me. What the author calls "smart kids" \- which are in reality intelligent kids with the advantages of a good familial support system - can make much better use of their time than sitting in a classroom waiting for the bell to ring. ------ visakanv I hesitate to describe myself as a smart kid, but I wish I couldn've skipped most of my formal education. I ended up making a living from the things I was doing outside of school, and wish I had been able to pursue those things more, and with less guilt and shame. I'm still unlearning a lot of the bad habits and dealing with the anxiety I developed when trying to deal with school. Sigh. It's not a huge deal, there are worse problems in the world, but it upsets me to know that there must be others like me going through this year after year. ------ HytDskUYS66 I don't get the trend of being all cool and dropping out of High School or College or whatever. High School gets out at 2 or 3 pm, leaving plenty of time to learn on your own or build projects. At that age are you really going to start a business? Besides, school provides motivation to learn, especially Colleges, because you are paying for it and don't want to fail classes. ~~~ PretzelFisch high school does not provide motivation. That comes from the students that know what their next step is and want to get into college. ~~~ dalke Umm, you realize a lot of people are motivated to graduate from high school because they have career plans which _don 't_ require college, right? There are entire high school programs of vocational, technical, and career education for those whose idea of a "next step" is different from an academically oriented career that most colleges teach. There are also people who graduate from high school because their goal is to enter the military, either as a career or for specialized training, and the military requires a high school degree. Also, by the same logic, college does not provide motivation either. ~~~ PretzelFisch I thought I had more emphasis on next step when I wrote the reply. I don't believe school is ever the motivation. Those that are motivated in school see it as a means to an end. ~~~ dalke In my school district, school was only required until 16. Some did leave school then. _Everyone_ who remained had some sort of motivation, because it was voluntary. ------ cel1ne School education often has a leveling effect. It helps people below the average to get better and blocks people above average or brings them down by boring them, leading to avoidance of any real effort. As far as i know most nobel price winners are below IQ 150, because apart from being smart, they learned to work hard and not go through school with minimum effort. 
------ rdl I dropped out of high school to go to MIT early, then dropped out of MIT to do a startup, and never bothered submitting the class transcripts to my high school for the gym and English classes I needed to get a diploma, so technically: "highest level of education completed: middle school." ------ pakled_engineer In first grade was given a bunch of tests and determined I had a "grade 12 reading level" whatever that means. I was sent to one of those so-called child prodigy schools but had no interest in anything they were teaching me. Instead I would disassemble every electronic box in the school and hack around with it. They kicked me out of that school for doing nothing the entire time except taking apart electronics and returned me to the public system. First day, when we received our texts for the year I'd read through the entire text on the weekend and do all the exercises. Then I'd just slack off all year, usually only showing up to do the tests and weekly quizzes while spending the rest of my time hanging out with the other delinquents in my school who never went to class. This worked fine until about Grade 10, when I decided to not even bother reading the textbooks on the first weekend of school anymore and just didn't show up or do any work at all. I was solely interested in hanging out with the crazy STS chatboard goths and IRC hackers I befriended who all partied at this guy's warehouse downtown everyday, which I got away with for about 8 months until one day I forgot to intercept the mail and phone calls from my school and my parents discovered I wasn't even going anymore. My routine was to get up and walk to school, attend homeroom to show I was actually there then just take off to go downtown to said hacker/drug dealer's warehouse. There was always a ton of people there it was a defacto hackspace and party house. I learned more hanging around those people for a few months than I did all of high school. When my parents gave me an ultimatum I decided to go squat with a bunch of street punks and just hang around the hackerspace all day. I probably would've been satisfied going to some kind of engineering or compsci program after school if it had existed at the time. I liked hanging out with my friends in high school and was glad I still went to be social but it would've been great if school was only a few hours, and the rest of the day I could have pursued my interests in electrical engineering at a non institutionalized type environment. Hanging with my friends was fun in school between and after classes but everything else about it felt like prison. Somebody you don't elect hands you arbitrary rules to follow and you just end up feeling trapped. I'm sure plenty of other kids don't mind high school for a few hours a day but would much rather spend the bulk of their time learning something else they're actually interested in. I have no idea how this can be accomplished but a full day slogging through half a chapter in a textbook on a subject you have zero passion about isn't it. ------ brudgers [US centric] Some kids should skip some part of some high schools. Smart is orthogonal to the issue. "Skip" is a sliding scale. Early admission to college, home schooling, and starting in a trade are all on it. So is just getting the fuck out of a bad situation. The people who should skip are those for whom the "these are the sizes we carry" [no high school is actually one-size fits all] doesn't carry a healthy size. 
The likelihood that a part that should be skipped is inversely proportional to it's length: senior year is more broadly appropriate than 9th grade. ------ vhost- I dropped out because the school system wasn't designed for kids with dyslexia. All I ever felt was left behind. ------ agjacobson The writing in this article was disorganized, meandering, and it was not even clear what theses were being defended. ------ joesmo High school in the US is a joke. What did I learn? I learned math is horrible. Wait! What? Yes, school actually killed my passion for math. When I wanted to do matrices, they wanted to do arithmetic. When I wanted to do trigonometry, they put me in 3 (THREE) years of pre-algebra. How much fucking pre-algebra can you have? Everything I attained during and after high school was DESPITE school, not because of it. I taught myself how to program like many here, 3d graphics, sound, music, writing, etc. I wrote a book that got me a full tuition scholarship (despite having no official extracurricular activities) in my spare time. High school was so useless, I went a whole semester doing homework only one time and getting all A's and the occasional B+. And this is at one of the best public schools in New Jersey. Seriously, what a joke. What a waste of time. What torture to be put in there with all the rest of the kids, most of them unlikable idiots. And worst of all, what lost potential during those days. How much more could I have done had I not been forced into this elaborate babysitting scheme we call high school? How many books would I have written? What software would I have developed? What animations would I have created? If only I hadn't been so tired from the daily bullshit of high school ... ------ gscott My son went to Grossmont Middle College which is a charter school for the last two years of high school located on a jr college campus. He was able to get enough college units to save him a year and a half at 4 year college. High school, depending upon your goals and classes can be not the best use of time if you can just jump in and do the college courses anyway... why do double work. ~~~ foolfoolz because taking your time through college can be an incredible experience. ~~~ quanticle On the other hand, parent-poster's son probably saved enough money to buy a car. College credits are incredibly expensive, and if you can do them for free in high school, there's no reason to skip them. On a broader level, I would very much like to disagree with people who say that you should take your time through college. Take your time through college... _if you can afford it_. Don't forget that you're paying a good ten to twenty thousand a year for the privilege of taking your time. ~~~ gscott We are ridiculously poor so money saved is a large benefit. But because he qualifies for the whole fasfa aid and since he saved a year of college he is doing a double major. So getting college units in is allowing him to do more and still be the right age getting out. ------ xiaoma I went to college before high school, starting at age 13. I was a full-time student until the age of 15, when I went back to high school due both financial and social reasons. Going back to high school was great for my social life, but it was horrible for my education. Having already taken a good chunk of a math major, I had to take high school math and science courses because those were the rules. I took a ton of AP courses but still had a poor GPA. 
In the case of AP chemistry, I got more college credit for passing the exam but an F for the course due to the heavy weighting of homework. Over the next couple of years I lost all respect for formal schooling and became a pretty bad student. On the other hand, I played sports and made some great friends who I still know even now. In all honesty I can't say if the social costs of staying and finishing a degree at such a young age would have been worse or not. My current thinking is maybe it's okay to skip high school, but don't go back if you do. ------ melloclello I wish I had skipped high school. At best it was a waste of time and at worst it was a highly traumatic experience for me. I guess I'm doing better now but the first few years I spent in the real world were spent desperately trying to unlearn everything I'd learned from that experience. ------ 1stop Is this like saying "Rich people should hoard their wealth"? There is something to be said for smart kids being at school: they both receive and PROVIDE benefit to/from the other students. But I guess, who cares about dumb kids, who had less opportunity than the smart ones? They'll get a better draw next life! ~~~ dkopi Actually, smart kids usually end up a net loss for classes. They get bored, so they skip classes, or end up sitting in class bothering everyone else. They also end up frustrating other kids because "why does it come so easy for X but not for me?" Bad teachers even use smart kids as examples: "See, you all should be a lot more like smart kid". A smart kid isn't going to magically become a free tutor for other kids at school. Let teachers focus on educating. Don't shift that responsibility to teenagers who could be advancing significantly in other environments. ~~~ 1stop Research shows the opposite. Mixing ability levels increases the median mark. The point of publicly funded education is not to produce a minority of geniuses but a majority of smart people. ------ coldtea The best thing about school is it being a forced social experience -- preparing one for life itself, and even having them interact with others even if that's not their "thing" (it wasn't for me either). Not the learning. ~~~ MustardTiger What makes that the best thing? All available evidence suggests it is not a relevant thing at all, since home schooled children are indistinguishable in social skills to regular schooled children. The forced social experience of a typical school is very artificial, and does not apply well to many other social settings. The only real similar setting is prison. Unless the goal of school is to prepare children for the social experience of prison, I don't see how that could be the best thing about it. ~~~ coldtea > _What makes that the best thing? All available evidence suggests it is not a > relevant thing at all, since home schooled children are indistinguishable in > social skills to regular schooled children._ I, for one, very much doubt that "available evidence". ~~~ MustardTiger Facts don't really care if you doubt them or not. Please, answer the question. What makes that the best thing? ------ cant_kant School is only incidentally about education. For instance, never underestimate the power of networking at an elite private school. A few random stories, all true: A client of mine is global <role_deleted_to_protect_anonymity> director of a multi-billion dollar Scandinavian corporation largely because a school friend knew a few board members.
A friend obtained deep access to a person, considered to be among the five most powerful people in his country, due to his school network. Another friend was contacted by a school friend and helped him to land a job in an elite investment bank despite the lack of any relevant experience. ------ carogeraci So maybe what we need are high schools where students start by learning how to devise a project and work on it. This stage will give students the skill set to do their own project while introducing them to a wide variety of concepts. In stage two, students write the question they want to answer and form a study group to work with. Stage three is completing the project and stage four is some form of publication. ------ teslabox John Taylor Gatto [1] says that it's more important to skip as much of the early years of schooling as possible. If the child spends that time in the real world, nothing of importance will be missed. For me, kindergarten was incredibly boring... Nothing of importance to me was ever learned at school, that couldn't have been learned quicker on my own. [1] [http://www.johntaylorgatto.com/](http://www.johntaylorgatto.com/) ------ cloudjacker There have been times when I felt I would be more productive doing something else, at different phases of life. Statements like this article's really depend on what someone's end goal is. If you are striving for some kind of above average monetary success ... earlier, there's nothing about this that guarantees it. If you aren't, then you need a different end goal to make this useful. ------ enthdegree Dad and mom did this to me and my sister, and dad wrote a blog on it. Here is his post-mortem: [http://chapmankids.net/blog/why-not-skip-high-school-a- serie...](http://chapmankids.net/blog/why-not-skip-high-school-a-series-on- our-experience/) Skipping high school may not be bad idea, even for the not-extremely-gifted. ------ microcolonel I skipped half of middle school and high school, I think it paid off. I have a great job and a jumping off point for whatever ambitions I might have. ------ aufa The main problems that constrain the development of education in a country is the government's policy. ------ nphang Well this is a load of drivel. And all yall are agreeing with it because every single person on this forum considers themselves smart so you like agreeing with people who claim to be smart and know smart people stuff. "so I took California’s GED test in June, 2012. It was dead easy." If that's your basis for calling yourself smart then I'm not impressed. I took the GED high as a kite and got a perfect math score (know what's similar about both our statements? we both sound like jerks). The GED isn't a smartness metric, it's a soap bubble test so white people don't get stuck as fry chefs for their whole lives. It's _supposed_ to be easy as long as you understand the questions. Welcome to the upper class nimwit. "People worry about Google and the instant availability of knowledge making people dumber, because we don’t have to memorize much anymore — Socrates felt the same anxiety when writing and reading were invented." Your events are off by a few _millenia_. "contrary to what teachers and school board members might want you to think, getting into college is easy if you’re intelligent and work hard to do interesting things" False. Just false. I mean, I'm assuming you want to go to a top tier college, not community college. 
Top tier colleges are for expanding the upper tiers of the gentry class, while community colleges allow entry into the gentry class. To move to the upper tiers of the gentry, you need to have shown that you respect the institution of the gentry, ie. going to high school. Exceptions maybe exist for prodigies or minorities who built a clock once (because that's soooo amazing, who here _didn't_ build shit like that as a kid), but in general skipping high school to learn sewing and greenhaus building is an acceptable path to an Amish lifestyle, not differential equations and the Ivy League. I thought it was generally acknowledged at this point that school isn't about education? School is about socialization, connections, and a prodding to at least have some depth of knowledge about a general corpus that it's accepted people should know about. No one actually expects you to be able to find x in real life, but everyone is familiar with the concept of finding x. I am an _astrophysicist_ and I use, at maximum, 5% of shit found in my physics texts through the year. _School is not about the shit in the books, that is not the point of school_, you're like an atheist telling a Catholic that Jesus was an asshole because evolution -- it's a valid point but it has nothing to do with the conjecture. School is an indoctrination procedure so we don't schism even further into our already highly segregated class based society, saying that it's a waste of time because class and homework are stupid is a correct statement, but misses the entire point. Even anarchists believe in school. ~~~ corser Elite universities are not for expanding the gentry class. They are so that the elite can meet other elite as they are extremely distributed geographically, get married and continue on the class system. The lower class people that go there are aiming to join the elite, but the vast majority of them simply become highly paid servants to the owning class. We like to think that startups made by dropouts establish that anyone can make it big, but for the most part these startups are children of millionaires living off trust funds until they break through; like Trump, no one should consider these rags to riches stories, accomplished though they may be. The main point is that for elites, their schools are so that their girls can bring a nice boy home and not some trash from main street while their boys can find a wife that actually stands a chance at domesticating them. It's really just that most geeks are so focused on their projects that they forget that there are more important things, like their family. Hence why we constantly hear incredibly talented but socially awkward people call for the repeal of school. Because social things don't mean anything to them. They generally become tools of the elite class. I generally agree with the idea of creating environments where people can meet their future family because finding a quality spouse is extremely difficult without expensive and fool-proof filters in place. Connecting it with education is inevitable because children and young adults are nearly useless in a modern economy. We can't make it excellent education because there is a lot of economic and social pressure for truly talented individuals to abandon academia in favor of the market. My main complaint is that the elites are no longer virtuous so they do not deserve to exist anymore. ------ pirdiens i'm 21 and still in high school oh gods get me out of this. 
~~~ pirdiens to be more clear on this, i've skipped classes ever since high school started, this was due to being introduced to a computer at a young age and having access to the internet - i was amazed by the world and got to see the "oh what the hell this is vile" stuff; it's difficult to fit in my classes. my parents didn't care what i did on there because they had to focus on their job. my mother was, and still is, kind about this. she probably knows the horrors i've seen. so, uh, i guess, i agree with this article even though this depends on where you're from. there are no "smart kids", though, because, imo, everyone is smart.
{ "pile_set_name": "HackerNews" }
How covid-19 conspiracy theorists are exploiting YouTube culture - hhs https://www.technologyreview.com/2020/05/07/1001252/youtube-covid-conspiracy-theories/ ====== thedudeabides5 Seems funny that the "conspiracy theories" that got sites like ZeroHedge banned from twitter are now the very theories being floated by the secretary of state. Say what you will about Pompeo et al's true intentions, but the fact that _truth_ is often not known with certainty at the time of publication is something all media outlets have to grapple more seriously with these days. ~~~ aaron695 Given it's obvious how useful C19 would have been to North Korea. Far more than nukes. Why the hell were we/China not trying to see what North Korea might be up to with Coronaviruses and playing with them in military labs? If ISIS are not looking into this now they also are idiots.
{ "pile_set_name": "HackerNews" }
Docker for Mac and Windows Is Now Generally Available and Ready for Production - samber https://blog.docker.com/2016/07/docker-for-mac-and-windows-production-ready/ ====== senex I've been tracking the beta for a while. I'm confused about this announcement. These issues still seem unresolved? (1) docker can peg the CPU until it's restarted [https://forums.docker.com/t/com-docker-xhyve-and-com- docker-...](https://forums.docker.com/t/com-docker-xhyve-and-com-docker-osxfs- cpu-usage/10537/32) (2) pinata was removed, so it can't be configured from CLI scripts [https://forums.docker.com/t/pinata-missing-in-latest-mac- bet...](https://forums.docker.com/t/pinata-missing-in-latest-mac- beta-1-11-2-beta15/15541) (3) it's not possible to establish an ip-level route from the host to a container, which many dev environments depend on [https://forums.docker.com/t/ip-routing-to- container/8424/14](https://forums.docker.com/t/ip-routing-to- container/8424/14) (4) filesystem can be slow [https://forums.docker.com/t/file-access-in- mounted-volumes-e...](https://forums.docker.com/t/file-access-in-mounted- volumes-extremely-slow-cpu-bound/8076/168) Are these fixed in stable? I'm personally stuck transitioning from docker- machine and (from the comments) it seems like other folks are as well... ~~~ jmspring Sadly, the state of things, be it the Docker ecosystem or others, "ready for production" means something much different than it did years ago. For me, the definition of ready for production, Debian is a good example of the opposite end of Docker. ~~~ rhinoceraptor I think by 'production', they mean 'ready for general use on developer laptops'. No one in their right mind is deploying _actual_ production software on Docker, on OS X/Windows. I've been using it on my laptop daily for a month or two now, and it's been great. Certainly much better than the old Virtualbox setup. ~~~ mherrmann I'm still using VirtualBox. Could you elaborate why Docker is better? ~~~ numbsafari Leaving containers vs VMs aside, docker for Mac leverages a custom hypervisor rather than VirtualBox. My overall experience with it is that it is more performant (generally), plays better with the system clock and power management, and is otherwise less cumbersome than VirtualBox. They are just getting started, but getting rid of VirtualBox is the big winner for me. ~~~ carwyn It's based on the OS X sandbox and xhyve which is in turn is based on bhyve [https://blog.docker.com/2016/03/docker-for-mac-windows- beta/](https://blog.docker.com/2016/03/docker-for-mac-windows-beta/) ------ Todd I just installed D4W a few days ago for the first time. It's been great. It's a seamless experience on W10 Pro with Hyper-V. I've used VirtualBox a lot and I like it but I always have to fuss with the network bridge and such. With this, it's hard to tell I'm even using a VM. Their network port mapping is seamless. FYI, I was using rc4 and I didn't see any information on how to upgrade (should I uninstall first?). I ran the release setup and it did an in-place upgrade, deleting earlier components and such. ~~~ sootzoo My experience has been the auto-upgrade works without requiring reinstall (and indeed just did so to GA last night). I don't think there's any specific fresh install requirement. ------ yladiz I don't use Docker much now, but in my experience the reliance on Virtualbox (on Mac) was a little clunky and annoying, and I really wished for native support without Virtualbox. I'm super happy to see that's here! 
~~~ gtirloni I think it makes sense to depend on the "native" virtualization solutions for each operating system (Hyper-V and xhyve). We have been using Vagrant and VirtualBox heavily and the new Docker for Windows/Mac is making us reconsider that since you can't easily use more than one hypervisor on the same dev machine without some hassle. We might be building our Vagrant boxes for these other hypervisors soon. VirtualBox still seems easier to work with but there isn't anything much exciting happening with it lately. Let's see... ~~~ amorphid I believe xhyve works fine with recent versions of VirtualBox, since xhyve is a pure userland app (aka no kernel extensions). Check out the issues section in the xhyve readme... [https://github.com/mist64/xhyve/blob/master/README.md](https://github.com/mist64/xhyve/blob/master/README.md) ~~~ sigjuice xhyve needs to interact with all kinds of low-level things so there has to be kernel code involved. xhyve does not install kernel extensions of its own like VirtualBox or VMWare Fusion. xhyve uses the kernel extensions provided by Apple (com.apple.driver.AppleHV and possibly others). ~~~ amorphid Ah, that makes sense. It doesn't install anything extensions of it's own, only using the kernel bits provided by Hypervisor.framework. ------ girvo Absolutely love Docker.app, it's made life so much simpler at work for all of us, and performance has been increasing steadily (though it's still not 1:1 with boot2docker-xhyve). ------ voltagex_ On Windows, Hyper-V doesn't really play nicely with laptops. If you've got it enabled and bridged to your wifi adapter, Windows 10 may think that your connection is Ethernet and turn off all bandwidth saving features. I only found out after Windows Update had exhausted my monthly LTE quota. ~~~ drdaeman Speaking about Windows, it is also disabled on Windows 10 Home and only available on Pro edition. Hope they'll maintain VirtualBox support as a first- class citizen (well, given that it was the most mature option during the beta period, suppose they will). ------ Perceptes Basically echoing senex's comment, but this announcement seems bizarre in light of [https://forums.docker.com/t/file-access-in-mounted- volumes-e...](https://forums.docker.com/t/file-access-in-mounted-volumes- extremely-slow-cpu-bound/8076). In particular, a Docker employee responds with a status update in [https://forums.docker.com/t/file-access-in-mounted- volumes-e...](https://forums.docker.com/t/file-access-in-mounted-volumes- extremely-slow-cpu-bound/8076), saying this isn't resolved for stable Docker for Mac. It's totally unusable for Rails development right now. ~~~ nzoschke The 'convox start' dev environment enables Rails dev on Docker for Mac with a custom file sync strategy. This is another case of simple solutions win... You can effectively rsync code changes without all the low level file system madness. [https://convox.com/blog/bidirectional- sync/](https://convox.com/blog/bidirectional-sync/) ~~~ Perceptes Thanks. There are some workarounds posted in the thread I linked, too. Frustrating that Docker for Mac doesn't just work for the main use case (local development), though. ------ spilk Can I run my usual VirtualBox VMs I have running for everything else (non- docker related) alongside Docker for Windows yet? When I tried one of the betas it enabled Hyper-V which prevented me from using any of my other VMs. ~~~ jimlei Wondering about this myself, going to write up a tutorial soon for people I know who use Windows. 
Last time I tried Docker for Windows it broke VirtualBox completely (just had to disable Hyper-V). Might be an easy fix for that though, didn't spend any time investigating ------ ben_jones Can I expect a dockerfile that 100% works on Linux to 100% on Mac and Windows? ~~~ ktzar Short answer, yes. Medium answer, images your Dockerfile is based on are still run in a Linux environment, even if it's virtualised differently. ~~~ adamhepner Wait now, hold on a minute. I'm very confused and curious how does this work? Can I run any linux-based container on Windows? Can I run (are there any?) windows-based containers? If so, does it work the other way around: windows container on linux host? Does it somehow use the recently published Linux Subsystem for Windows, or is it completely different compatibility layer? If it is different, doesn't it seem like a waste of effort? ~~~ jimlei > Can I run any linux-based container on Windows? No, on windows you still have to run a Linux vm which the containers will run inside. Meaning all containers actually run on a Linux host. The new Docker for Windows app only abstract away some stuff so it feels easier working with. > does it work the other way around No ~~~ mh-cx > No, on windows you still have to run a Linux vm which the containers will > run inside. I don't think, that's correct. To me that's the whole point of having a native Windows / Mac version of docker. From their feature list: > Faster and more reliable – native development environment using hypervisors > built into each operating system. (No more VirtualBox!) ~~~ drdaeman No, GP got it right. The quoted part is that instead of VirtualBox one can use Hyper-V. In either case, it's handled by docker-machine which runs a GNU/Linux VM with Docker (host) tools installed, and containers are ran on that VM. I would be surprised if there aren't plans to support WSL (to run Linux- targetting binaries on Windows "natively", thus have "native" Docker containers) but don't think that's available yet. ------ sz4kerto I still can't bind mount a file in a container if that file already exists in that containter. Is this production ready? ~~~ avsm This seems to work fine for me: docker run -it -v /private/etc/passwd:/etc/passwd alpine sh (not recommended for any actual use obviously) Is there a particular case in which this failed for you? We'd appreciate a bug report on [https://github.com/docker/for- mac/issues](https://github.com/docker/for-mac/issues) (or from the Docker for Mac GUI, just click on "Diagnose and Feedback") so we can chase down whatever issue you're having. ~~~ sz4kerto Yes, this use case, it happens on Windows and on Mac as well. C:\Program Files\Docker\Docker\Resources\bin\docker.exe: Error response from daemon: oci runtime error: rootfs_linux.go:53: mounting "/var/lib/docker/aufs/mn t/90d24356afdeb7b9ddad4b3b6903be92063151c33bf34f3d63ede464437060c6/cryptoservice/broker- config.yml" to rootfs "/var/lib/docker/aufs/mnt/90d24356afdeb7b9ddad4 b3b6903be92063151c33bf34f3d63ede464437060c6" caused "not a directory". (I'm mounting broker-config.yml and that file is already present in the container. Most recent Docker for Win beta in this case, but getting the same on non-beta Docker for Mac.) ~~~ seeekr The error message specifically says "not a directory" and afaik you can't mount single files, only directories. 
I at least have never even thought of trying to mount individual files since the bind mounting functionality in Docker seems to always and everywhere have been described in a way that suggests that it's for mounting directories, not individual files. ------ Scorpiion An interesting fact I think is worth mentioning is that Docker for Mac uses a forked and currently closed version of xhyve, and not the same xhyve that we can find on Github. The last commit to open source xhyve was May 27. With that said Docker has plans to open source it, I wonder if that will happen soon as they declare Docker for Mac ready for production. That would imply that the xhyve port also should be ready be contributed back or spin out into a new project (the quote below said they were not sure if they wanted to contribute back or make a new project). Personally I think the "right thing to do" would be to contribute back to xhyve, at the same time I have a feeling it's more valuable for Docker Inc to "own" and control their own fork/project so I would guess they will go down that path instead (it would still be open source, just under a different project name). Source: [https://news.ycombinator.com/item?id=11356293](https://news.ycombinator.com/item?id=11356293) EDIT: I stand corrected, see talex5 comment below, I had missed the hyperkit announcement. ~~~ talex5 I think this is what you're looking for: [https://github.com/docker/hyperkit](https://github.com/docker/hyperkit) Source: [https://blog.docker.com/2016/05/docker-unikernels-open- sourc...](https://blog.docker.com/2016/05/docker-unikernels-open-source/) ~~~ Scorpiion Oh, my bad, thanks for the correction! :) ------ slantedview I need nested virtualization as well. I don't know if this is possible with the hypervisor being used, but it's hugely important for me. ~~~ mintplant Curious, can you share what you need it for? ~~~ slantedview Basically for hacking on cloud software that runs hypervisors such as kvm. docker-machine with a VMWare Fusion VM and nested virtualization enabled is the current approach I use - works fine for now. ------ HerpDerpLerp "This version of Docker requires Windows 10 Pro, Enterprise or Education edition with a minimum build number of 10586. Please use Docker Toolbox." :( ~~~ ZenoArrow That'll be because Windows 10 is the first desktop Windows OS with support for Hyper-V. If you didn't want to use Windows 10, perhaps you might have some more luck with a Windows Server OS. Does anyone know if the latest version of Docker will work on Windows Server 2012? ~~~ friism Client Hyper-V actually showed up in Windows 8 [1], but only recent versions of Windows 10 have Hyper-V with all the features needed by Docker for Windows. [1]: [http://www.howtogeek.com/196158/how-to-create-and-run- virtua...](http://www.howtogeek.com/196158/how-to-create-and-run-virtual- machines-with-hyper-v/) ------ geekbri My biggest issue with it is that there seems to be no easy way to provide more space to the VM docker runs inside. While it seems trivial it can be useful if you happen to have really large images (yes, for valid reasons). If you run too many containers you just run out of space. ~~~ cptskippy I ran into this too when fooling around with SyntaxNet. It hink the default size of the VM is 16GB. I kept running out of swap space even after bumping the ram on the VM up to 16GB. ------ turnip1979 Does this thing work on Windows 10 home edition? That doesn't have hyperv I think. 
~~~ kristianp No it won't work, you're right it seems: [https://msdn.microsoft.com/en- us/virtualization/hyperv_on_wi...](https://msdn.microsoft.com/en- us/virtualization/hyperv_on_windows/quick_start/walkthrough_compatibility) ------ amq I've been trying to switch from VirtualBox to Hyper-V twice to use Docker for Windows, but always hit the same wall when using a desktop Linux guest: no 3D acceleration, no resolution scaling, no shared clipboard. ~~~ lewisl9029 Yes, Hyper-V's Desktop UX for VMs is still in a _really_ sad state compared to the competition, even for Windows guests, let alone Linux ones. I have it enabled to use Docker for Windows by default, but very often still need to reboot to disable it and use VMware Workstation for any serious work inside VMs, for all the reasons you listed, plus the awesome multi-monitor support in Workstation. Microsoft really needs to get its act together. ------ sheraz Good to see they are moving forward, but I have a working rig at the moment with virtual box. "If it ain't broke, then don't fix it" is a motto I live by. Better it's your blood on the bleeding edge rather than mine :-) ------ jamespacileo Docker for Windows is unfortunately a bit gimped right now: \- docker-compose isn't OS agnostic and as versions go forward Windows is lagging behind \- this uses Hyper-V which blocks both Virtualbox and Vmware from running ------ cowmix I love Docker for Mac but I have had a problem of containers just disappearing after running for a while. ~~~ lahdekorpi It has a pretty good logging, have you taken a look if it crashes or something? ------ saysjonathan If you have issues with the Docker for Mac on Sierra, turn off ntp: sudo launchctl unload /System/Library/LaunchDaemons/org.ntp.ntpd.plist If I see that "We are whaly happy to have you" welcome screen one more time though... ~~~ superchink Can you actually live without NTP though? Isn't that a pretty critical service? ~~~ zbyte64 One of the issues is that if your laptop goes to sleep your linux container becomes out of sync on the time. To fix this you have to restart docker. ------ johnchristopher Last time I checked it didn't support 32 bits but could be tested on Windows 7. Now I see it requires `Microsoft Windows 10 Professional or Enterprise 64-bit For previous versions get Docker Toolbox`. ~~~ cptskippy Yeah, that's part of why this release baffles me. From 1.11 to 1.12 they dumped VirtualBox in favor of the latest version of Hyper-V which why it only supports Windows 10. That's a pretty significant change in my mind but it didn't seem to extend their testing/validation timeline at all. ------ jaequery Anyone know if the volume ozone performance have been improved to at least the level of nfs method? This is what keeps me away from using the docket for Mac and just sticking to dinghy until this is fixed ~~~ Axsuul Would like to know this as well. Docker for Mac performance is still horrible. ------ AjithAntony ARGH! I can't use Hyper-V on this hardware (no SLAT support). ------ esseti is anyone having troubles with docker-compose (on mac)? It seems that ports are not forwarded/opened. ------ xutopia I'm finding it really funny how even Docker users can't really explain easily what Docker is. ~~~ alex- Docker is a process launcher that makes it fast and simple to start processes with a unique network/filesystem/process/user space (via cgroups and namespaces). ------ cybernazi It won't work on Windows 7, 8. It needs Windows 10 pro or enterprise to work. ------ cyzhu Greate! 
It was hard to install before.
Palindromic Place Names - Amorymeltzer http://www.cntraveler.com/stories/2015-07-20/the-only-reason-to-visit-this-remote-australian-spot-is-its-name-palindrome ====== OedipusRex This is the exact thing I would expect Ken Jennings to write about.
Change default gender in the dining philosophers project - engintekin https://github.com/rust-lang/rust/commit/b748c2e90d87985fbff1d99e17d94a10cf2b3f21 ====== Fud4Thought "The paper from which this example was taken made the mistake of assuming that all five philosophers are men." And some mistakes are unforgivable.
Arrington: 'Demo needs to die' - edw519 http://www.news.com/8301-13772_3-9909841-52.html?tag=nefd.top ====== icky > "I think that certainly...TechCrunch 50 is a great venue for young > companies," Shipley said by phone from Madrid, Spain. "But to put it up > against Demo means those companies are now going to be competing for > attention, and I just don't see how that's good for entrepreneurs." Actually it gives entrepreneurs value by letting them choose between spending $0 for, say, half the attention formerly held at Demo, or by paying Demo's fees for half the attention formerly held at Demo. Sounds great for entrepreneurs, terrible for Demo. ;-) > Still, she suggested Arrington's assertion that Demo needs to die is > unfortunate. > "I'm not certain why it must die," Shipley said. "I think that there is a > lot of room in the market for products and services that support > entrepreneurs. And I don't see how it's a benefit to (the entrepreneurial > ecosystem) to kill off a platform that's all about supporting > entrepreneurs." Hey, that's exactly the opposite of her previous argument! What is the word for when a hypocrite hypocritically uses different standards for different people in similar situations, in a hypocritical manner? Oh, yes: _hypocrisy._ ~~~ skmurphy I don't read the statements are a contradiction. I read "to put it up against DEMO" as meaning "in the same calendar slot." If DEMO does support entrepreneurs--which it does--how does having more choice hurt? If you don't like their fee structure, then apply to TC50. Chris Shipley has been in the industry for a while and has helped a number of startups: I don't read her statements as hypocritical or believe that she is a hypocrite. If DEMO were really moribund Arrington wouldn't be able to attack it to get attention. From a YC/Hacker perspective, even if you prefer the TC50 model to DEMO, I don't believe that anyone is well served by having them collide in the same week. ~~~ icky > I don't read the statements are a contradiction. Perhaps I was reading too much malice into it, but I came away with the distinct impression that had Demo been muscling in on TC50's weekend, her arguments would have been reversed. ------ wumi proof that sensationalist titles win out: <http://news.ycombinator.com/item?id=153604> ~~~ rms Yes, in the general case, but that quote is also by far the most interesting part of the article ------ redorb Techcrunch50 (proof that profit wins out) ... it was TC20, (then moved to TC40) then added a demo (pay for play) pit, then added sponsors. Im sure Mcdonalds thinks Wendy's should die.
My Secret Weapon to Getting Contracts - jmorin007 http://www.smashingmagazine.com/2008/08/12/my-secret-weapon-to-getting-contracts/ ====== astine I found out in college that I made more friends when I went out of my way to introduce myself to people. Business works in a similar way. ------ hhm That's good for starting, but not for big contracts, as it shows you as an idle developer, and that isn't good for bargaining in my opinion. ~~~ swombat _it shows you as an idle developer_ Not really. Not any more than cold-calling does, anyway. It shows you have spare bandwidth. That doesn't mean you're idle, it just means you have some room to do additional work - which, if they happen to need something, is just peachy perfect. ~~~ hhm Yes, it might be true. I guess it depends on how you present yourself for doing the sell. ------ utnick Anyone tried to do this with programming contracts instead of design contracts? ~~~ tocomment1 I'm curious too. It would be harder to look through the phone book to find companies with bad programs :-)
How I discovered my love for JavaScript after throwing 90% of it in the trash - haburka https://hackernoon.com/how-i-rediscovered-my-love-for-javascript-after-throwing-90-of-it-in-the-trash-f1baed075d1b ====== igemnace `const` doesn't mean immutable data. It means immutable variables, i.e. you can't reassign to `const` variables. You can mutate the data regardless (push to an array, add a new key to an object), unless the variable points to an immutable value to begin with (primitives, or actual immutable data structures such as with Immutable.js). ------ empthought Replacing ifs with ternaries and switches with cond are hardly significant or important changes. The whole point of abstraction (functional or object-oriented) is to push that logic from control structures in code to composition structures in runtime data. If you weren't replacing conditionals with polymorphism, you weren't doing object-oriented programming in the first place. ------ satori99 > The for loop is now completely extinct in my codebase. If you do happen to > stumble across one, point it out so I can kill it. For loops in JS are _significantly_ faster than .forEach() or .map() When performance matters (WebAudio, WebGL etc), you can't easily get rid of for loops just yet. ~~~ fenomas This isn't a concern unless you're tuning performance and the loop is a bottleneck. forEach _can_ be slower (if the JS engine doesn't inline the function body), but not to the extent that you need to structure your code around it. ~~~ satori99 Yeah, I agree with that. Which is why I mentioned performance sensitive stuff like WebGL. Personally I use the functional methods whenever I can, but the original quote I re-posted indicates that the author thinks that for-loops should always be replaced with their functional equivalents, and I was just pointing out that this is not universally a good idea yet. ~~~ fenomas Yeah, I agree - just adding a caveat to your caveat. Personally I found the whole article very skippable. None of it is necessarily terrible advice, but everything is overstated, and said without really explaining why its advice better than the alternatives. ------ kimi Suggestion: throw the remaining 10% away as well and go for ClojureScript. You are ready. ------ xkcd-sucks Is it really okay to do recursion in JS?
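To make igemnace's point at the top of the thread concrete, here is a small TypeScript/JavaScript sketch (illustrative only, not taken from the article) showing that const blocks reassignment but not mutation, plus the map/filter style next to it:

    // `const` prevents reassignment of the binding, not mutation of the value.
    const xs = [1, 2, 3];
    xs.push(4);        // allowed: the array itself is still mutable
    // xs = [];        // error: cannot assign to a const binding

    // Shallow immutability needs something extra, e.g. Object.freeze or a
    // library such as Immutable.js (mentioned above).
    const frozen = Object.freeze([1, 2, 3]);
    // frozen.push(4); // throws in strict mode; a compile error in TypeScript

    // The article's map/filter style, for comparison with a for loop:
    const doubledEvens = xs.filter(n => n % 2 === 0).map(n => n * 2);

None of this changes satori99's performance caveat above: in genuinely hot paths a plain for loop can still win.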
Dang is a terrible moderator who engages in ideological censorship - bonetopick I&#x27;m sorry, but after trying yet again to contribute and yet again being slapped around I&#x27;ve finally had it.<p>Dang, you are the single worst moderator I&#x27;ve ever had the displeasure of having to deal with. You regularly ban, throttle, shadowban, and otherwise censor used for purely ideological reasons while claiming that their posts are &quot;unsubstantive&quot;. The entire structure of HN does inevitably lead to an echo chamber of people who are at times incredibly lacking in self awareness, but your actions only exacerbate this problem.<p>Anyway, this post is giong to be downvoted, flagged, and probably deleted with the account banned and maybe a weak redirection to an email account. God forbid we discuss the state of the site and moderation publicly. ====== dang Why don't you supply links so readers can make up their own minds? It sucks to get moderated, and often feels like the mods are doing it because they disagree with you. Whatever your ideology is, though, I guarantee you there are users on the other side complaining that yours is the one we secretly agree with and theirs is the one we're censoring. ~~~ ReggieJJJ I think you may be underestimating this issue. It is leading to one of the most heinous and vile things someone can do on an internet forum, shadow banning, akin to putting them in an isolation cell; only worse worse, basically transporting them there without even letting them know they are/were put there and users keep commenting and contributing while being essentially locked away and lost in an authoritarian prison cell where no one can hear them. It's downright sick and depraved and one of the worst practices of the tech sector today for which there is zero justification regardless of what unapproved things were said. There is a real sickness on the net today where "moderators" have come to think of themselves as the thought police, but even worse, not just thought police of an authoritarian regime, but a kind of self-anointed little dictator that rules over the user rabble at will and upon whim. If you want to silence people, at least have the spine of any other run of the mill authoritarian and tell users their fundamental human right to speech and expression was taken from them … don't just silently transport them to digital purgatory. Again, it is a vile act that only the most depraved people would excuse, let alone endorse, no matter what unapproved things were thought and said … which are only words/characters on a screen that can be dismissed and ignored. Ok, I've said my peace. It's a sick and depraved practice and I cannot stand by just not say something when the topic comes up while digital regimes like HN, reddit have, what are essentially digital version of hellish totalitarian regime isolation cells that "mods" disappear people/accounts to. ~~~ dang We don't typically shadowban accounts that have much history on HN; we tell them we're banning them: [https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...](https://hn.algolia.com/?dateRange=all&page=0&prefix=false&query=by%3Adang%20banned&sort=byDate&type=comment). We shadowban spam accounts and new accounts that are egregiously and/or repeatedly breaking the site guidelines, especially if they show evidence of being serial trolls. That's a fundamentally different case—there are many users who simply create new accounts as we ban their older accounts and carry on posting as before. 
It isn't a good use of resources to patiently coax them to improve or even to reply to their posts at all. These users know perfectly well when we've banned them; it's a cat and mouse game and not at all an "isolation cell". It's certainly true that there are a few good-faith users who get shadowbanned because we mistook them for bad-faith users—people who didn't realize that they were breaking the site guidelines and could have reformed if we had explained and asked them. But there are orders of magnitude fewer of these than your comment implies. I'd prefer that there not be even one such lost lamb, but it's just a hard problem to solve. Software can't distinguish these and moderators can't read intent correctly in every case. ~~~ ReggieJJJ And yet, here you are, validating the very criticism of your heavy handed authoritarian nature. You make the very same kind of authoritarian claims of "evidence" (where is it, show it and provided it for public scrutiny and audit then) and you weigh that "evidence" against arbitrary (yet comically inline with your own self-interest) measure against ambiguous rules. It's the authoritarians' playbook 101, regardless of whether the authoritarians are competent enough to recognize it, let alone self-are enough. (pro tip: you guys never are … never) Just try to reflect on the authoritarian and supremacist mentality you have with language like "It isn't a good use of resources to patiently coax them to improve or even to reply to their posts at all"; yet your expend resources to censor and control and rule with an iron fist anything that is not to our liking. No one is talking about obvious spam or any kind of illegal language like credible threats of violence or stalking or bulling, but if you were at least honest with yourself you would acknowledge that it has nothing at all to do with "good use of resource" and all to do with control. Because it is irrefutably true that if it were about "good use of resources" you would wholeheartedly avoid censorship of legal language and not violate what what has been deemed a human right, free speech. You types, people with the authoritarian supremacist mentality you clearly have, are always quite to rationalize why it is fine that you be the arbiter of right and wrongthink, but it never avails you of the inherent evil exhibited by controlling speech and expression, no matter how much beneath you or inferior and unworthy you decree that speech to be. ~~~ dang You're arguing for a style of moderation that has never applied on HN, because this isn't that kind of site. Rather than complaining about that here, it would be in your interests to find a different forum that is more aligned with the kind of moderation you want to see. There's room for lots of different kinds of internet forum. It's good for different sites to make different choices about how they operate—that gives users get a richer set of communities to choose from. The thing to realize, though, is that there are tradeoffs. For example, you can't both have a site that's dedicated to intellectual curiosity and allow aggressive comments and flamewars. If you try, the flames will burn out the curiosity. That has significant consequences for moderation on a site like HN, where curiosity is the core value. In order to preserve HN for its intended purpose, we have to respect that reality. It isn't "authoritarian supremacist mentality", or any of the other things you describe, to want to maintain a particular kind of internet forum. 
If your job was to maintain a public garden, you wouldn't allow drag racing in the flower beds. ------ DoreenMichele _The entire structure of HN does inevitably lead to an echo chamber_ Group dynamics always and consistently create situations where "popular" views are much easier to express. This is not _caused_ by moderation and not unique to HN. In fact, good moderation should give some pushback against such which makes it less awful to participate in good faith while disagreeing with the majority view. My first-hand personal experience is that HN is generally better than most forums about allowing for and supporting disagreement with the popular views on the site, so long as you follow the rules concerning civility and the like. I find group dynamics fascinating. It's unfortunate that moderation of online forums is one of the best real world "labs" for such an interest. It inevitably gets interpreted as me sucking up to the mods here rather than me simply finding the topic interesting and desiring to participate in the discussion in good faith. I keep trying to find other outlets, but they tend to fail to satisfy. ------ ThrowawayR2 Personally, I like the moderation here. It strikes the difficult balance between allowing discussion of sensitive topics while preventing the forum equivalent of descending into all-out verbal Armageddon about as well as any humans possibly could. Bending to whatever ideology bonetopick wants to push, be it right, left or otherwise, would probably get equally angry posts from others so I encourage the moderators to give this post the minimal attention it deserves. ~~~ dang We tend to leave posts like the OP up. This one was killed by a software filter as well as heavily flagged by users, but I turned all of that off. That way we don't get accused of killing it for sinister reasons, but also the occasional thread like this can act as a pressure valve to air grievances and give us a read on how the community is feeling. My sense is that the bulk of the community does not share the feeling that moderation here is so biased or vindictive. But if that's wrong, we definitely need to know. ~~~ AnimalMuppet The only thing worse than moderators is the sites with no moderation. I've been watching a site I liked turn into a venting ground for mentally ill people, trolls, and people with a strange, obsessive hatred of one particular person and everything he does. It's rather unpleasant. So thanks for all you do. You get lots of complaints (nobody likes being on the receiving end of what you do), but the site is _far_ better because of you. ------ SamReidHughes Please point out the first substantive parent comment you see at [https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...](https://hn.algolia.com/?dateRange=all&page=0&prefix=true&query=author%3Adang%20unsubstantive&sort=byDate&type=comment) ~~~ OJFord Edit: Nevermind, I misunderstood parent comment completely. \--- What's your point? The search term is ensuring you only find moderation-hat comments, and _of course_ they're not going to be contributing with substance of their own to the topic being discussed. ~~~ yorwba "substantive parent comment" for that search, i.e. a comment that dang called unsubstantive but that is actually substantive. If such a comment existed, it would lend credence to the accusation that dang was wrongly calling it out for purely ideological reasons. ~~~ OJFord Oh I see, sorry! 
I thought you were saying 'look at all these comments of dang's, calling out other people, but not a single one of them is subtantive itself'. ------ aguilar I've been reading top HN posts for a few years now and recently started reading a little more of the comments. My personal impressions are that here we have a good and precise moderation. What makes me think this way is that I mostly remember reading deep, reasonable, authentic and specific comments and discussion, even in long lists of comments - different from some other forums / channels where it might be more common to see hateful, toxic comments. Is it a good moderation or the public here is more polite? Or both, maybe? What I see in this post here is a lot of offensive accusations and supposed constatation, without examples, evidence or even a narrative of facts that led to this rage and indignation... In my perception a clear example of post that is not useful, not following the guideline and doesn't promote any improve. I feel sorry you feel this way about the moderation here, but I suggest you deeply analyse yourself first. ------ yesenadam You provide no evidence at all. Like a single link to what you're talking about. (Why is that?) You say you've had it, yet here you are still. There seems quite a lot of discussion of the "state of the site and moderation" on here, but you act like it's forbidden or something. ------ masonic We apologize for the fault in the moderation. Those responsible have been sacked. ------ aww_dang I rarely agree with his stance in regards to political moderation. On the other hand I don't envy his position. It is easy to be critical, but harder to put yourself in his shoes and suggest improvements. Overall I think he wants to moderate for the most sensitive users of a particular persuasion. I view HN as a product of Silicon Valley's culture. The end result is that I self censor when I disagree with what I perceive as a the prevailing view on HN. Take it or leave it. Discussing political views online isn't an especially fruitful use of my time. Occasionally there are a few interesting articles. If I need a diversion I focus on the ones I'd like to read. Commenting is rarely worth it in my view. ~~~ danbolt What do you find you self-censor the most on? Perhaps not going in-depth, but I'd be curious on what you notice you feel isn't a good fit for the site. ~~~ aww_dang Off the top of my head, Climate change, China, Hofstede, the primacy of the state and bicycling oddly enough. The first one I won't touch anywhere online. China has too many wumaos. There's very little room to question the role of the state without things devolving rapidly. As it concerns cycling, suggesting that people can get out there as self- starters without the need for designated lanes had me warned for being impolite? Not sure how much gentler I could have said it. I few comments later someone else was suggesting that I was an outlier as a former bike messenger and that most couriers are run-over so my opinion isn't valid. Somehow this one passed by the moderation team. I get that the moderators can't be everywhere at once. Not here for Internet points. I want to be polite with everyone. But I view the pedantic wrist slaps as falling on one side and ignoring the other. ------ agumonkey odd how different people have wildly different experiences on the same website
Improve computer security by removing unwanted services - ihackforfun http://www.ihackforfun.eu/index.php?title=improve-security-by-removing-services ====== gforces These commands don't work on a Linux Mint install. Pointers, anyone? ~~~ ihackforfun You could try this link from the Mint community as a starting point: <http://community.linuxmint.com/tutorial/view/114> and also this article: <http://forums.techarena.in/operating-systems/1384637.htm>
Why Google stores billions of lines of code in a single repository - astdb http://delivery.acm.org/10.1145/2860000/2854146/p78-potvin.pdf?ip=148.182.26.69&id=2854146&acc=OA&key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E5945DC2EABF3343C&__acm__=1562623471_ec501a0b1907e9ce1ae6fb9751d6b541 ====== rootshelled Your link produced an error: "An error occurred while processing your request. Reference #50.7f00655f.1562624656.56db9bd" Just thought you should know. In the meantime I think this is the article: [https://m-cacm.acm.org/magazines/2016/7/204032-why-google- st...](https://m-cacm.acm.org/magazines/2016/7/204032-why-google-stores- billions-of-lines-of-code-in-a-single-repository/fulltext) It seems to be from 2016 ~~~ RandomGuyDTB Submitted before, too: [https://news.ycombinator.com/item?id=17605371](https://news.ycombinator.com/item?id=17605371)
Goodbye from Linux Action News - gerty https://linuxactionnews.com/153 ====== gerty One of my favorite podcasts, just gone like that. Does anyone know the backstory with acloudguru?
The Internet of Things – A Disaster - stock_toaster https://gekk.info/articles/iot.html ====== noonespecial With apologies to the late Mitch Hedberg, always design your device to fail like an escalator, not an elevator. When an escalator fails, it becomes stairs. When an elevator fails, it becomes jail. Don't design a computer that is a smoke detector, design a smoke detector that happens to also have a computer. A kernel panic or complete loss of computing should leave behind a traditional smoke detector in a fancy looking case. ~~~ dechov As an exercise of the limits of this advice, how does this apply to designing elevators, short of "never design elevators"? ~~~ ouid hard brakes that apply when doors are opened manually, and an emergency panel behind which a small ladder is located. ~~~ jack9 The costing difference between an elevator and stairs is substantial. Especially when you have to have stairs when you get an elevator by building code. ------ com2kid There is a battle between the old school people who want to do everything with microcontrollers and who measure RAM in Kilobytes, and the new generation who want to embed a full microprocessor + OS in everything. When it comes to capabilities, the people arguing for entire OSs generally win. Go with Linux, get an HTTPS stack, get XML parsing, get JSON parsing, don't worry so much about RAM. And heck, the price different between a microprocessor and a microcontroller is minimal now days, a couple bucks in large quantity (especially if you have to outfit the microcontroller with more chips to add functionality). The microcontroller people are right in one way though. Their products are simpler. Battery life will be better (it is easier to control power usage when you start with nothing and build up, then trying to minimize an entire OS down) and the code will be purpose built with less complexity to it. The ESP8266, ESP32, and Arduino are one view of things, Raspberry Pi and "Things with Android Running On Them" are another. Of course against all of this, the analog EEs laugh at us and continue to make things that just /work/. ~~~ wott I was hired to design embedded electronics in my last job. Came a project. I would have designed it with one microcontroller in the 50-120 MHz range, or two of those to make it easy to deal with all interfaces, and called it a day (on the electronic side) because almost nothing more was needed. It was basically just a glorified RS485 extender + Web -> RS485 interface. But our boss decided of the design: mandatory Linux and Python scripting, so we ended up with a Big Mama Ubuntu on a 1 GHz dual core processor with 512 MB DDR2 RAM... Because yeah, on the software side, you 'just' have to throw ready-made drivers and link pieces of ready-made software developed for the desktop. Supposedly... The board ended up with 1500 components, it took 1 extra year to get just half of the features kinda working, the rest of the features was discarded. And the power budget was blown, which probably led to interesting negotiations with other involved bodies, but I had already quit at that time. Obviously, 'embedded' meant very different things for me and for him. ~~~ joshvm It's fun being on the other end of the scale - the last major commercial project I did used an Atmel Atmega1284. Most other systems in this space go down the route of FGPA which is hell to support. 
I was actually amazed we got away with an 8-bit micro, but it was a great example of pushing the chip to its limits and comfortably achieving the goal of the project. ------ _iyig I worked at Nest. This blog post is full of BS. For example: "For the record, I read several stories about the Nest Protect going into permanent alarm, and you know what my hunch is? The same thing I always assume: "Dumb Linux crap." The culprit was probably some shell script that opened /opt/smoked/detect and output 1 to it and then left the file locked so nothing else could touch it or forgot to delete a pid file or whatever. This is what I always assume when I read about Linux integrated devices screwing up, and on the occasions I've actually heard what the cause was I usually end up right." As I mention in a separate comment, iFixit's teardown identifies the 100Mhz, 128kB RAM MCU which is the Protect's brain. Such a device does not run Linux. Had the author done any research at all, they would have found this. Instead, they waded into an unfamiliar topic with the knowledge of a novice and the arrogance of an expert - precisely the accusation they level against Nest. Furthermore: "The Smart device engineer does not begin by disassembling ten smoke alarms to see how they work. They do not begin by reading papers written by fire chiefs and scientists. They do not look at the statistics on fire-related deaths with and without smoke alarms of different eras (although the marketing department director does)" All this due diligence and much more was done. The author's lazy speculation insults me and my former co-workers. This blog post gets lots more important stuff wrong. Suffice it to say that today the Protect is very well-rated by consumers and safety professionals. The IoT field as a whole is a mess, and deserves much of the author's criticism. Nest, specifically, does not. ~~~ zkms > All this due diligence and much more was done. I don't have any connection with Nest Protect and don't own one, but the fact that the Protect does not use an ionisation sensor (they suck because they suck at detecting fires early, it has nothing to do with the harmless radioactivity) and used a dual-wavelength photoelectric sensor gives them high marks in my book. All the hardcore smoke detection equipment -- both the aspirating kinds ("VESDA") and the open-space beam/imaging kinds -- use dual-wavelength sensing. I cannot speak to all the "smart"/"connected" aspects of the device but the smoke sensor seems far better than that of every other residential smoke detector, and it would be very nice if other residential smoke detectors ditched the ionisation sensor and went to a dual-wavelength design. ~~~ makomk Most new smoke detectors are photoelectric rather than ionisation-based these days because China can pump basic models out for $2-3 shipped in quantity one. (In fact, the cheapest I've heard of them going for is £1 for a two-pack in a UK discount store recently.) Nest got a bunch of flack because apparently they screwed up their original single-wavelength implementation and had obnoxious problems with false alarms. ------ komali2 I don't think a poorly designed and engineered product, that _happens_ to connect to the internet (and to be fair is marketed as "IoT") means that the entire concept of "connecting devices to the internet" is "bullshit." 
I agree that there's a lot of marketing Bullshytt (to steal from neal stephenson) regarding IoT, but when we get to the root of what the words really mean (connecting stuff to the internet), I think it's the natural progression of technology. Why _wouldn 't_ you connect your field of oil pumps' valve readings to the internet (as well as having the manual needle- gauge backup) so you don't have to have a dude drive out every day to take readings? Why _wouldn 't_ you connect your thermostat to the internet so you can override the schedule when it turns out you need to stay late at work? Etc. I don't get why everyone has to lump together the marketing bullshit that obviously is overstated with IoT, and then go "hence, IoT will fail and is crap." ~~~ rubatuga The author mentions that he would like IoT for his washing machine, so I think you’re misrepresenting his sentiment. Most likely, he’s angry with the current engineering standards of IoT, more specifically Nest ------ 3chelon The author makes a very valid point about people replacing embedded systems with Linux. I have a home automation system that has been running for over a decade now, and has several generations of technology in it. The order of failure is, from highest to lowest:- 1\. the desktop I receive status updates on; 2\. the Raspberry Pi I added to monitor the remote nodes and email me status updates; 3\. the batteries; 4\. the physical parts: motors, etc; 5\. the embedded systems that read the sensors and control the motors; 6\. the analog circuitry. ~~~ mcbruiser3 do you happen to have a writeup of your HA system and how you put it together? ~~~ ocdtrekkie Not the parent, but I will probably be writing up mine soon. My choice of hardware is heavily inspired by the notion that devices function independently of the computer, and the computer functions independently of the Internet. ...My home automation software has bugs, and I've had plenty of problems with it. But I've yet to have a problem with my thermostat, lights, or smoke detectors. Despite all of them being linked to it. ~~~ mcbruiser3 OK, great. Will look for a reply with a link. ------ philipkglass I first read about Internet of Things in contexts where it seemed to make a lot of sense: monitoring bridges and pipelines for integrity, "smart grid" coordination to optimize electrical grid services, networked sensors for better preventive maintenance and higher productivity from existing capital equipment (e.g. mining facility trucks, crushers, power plants...) It's sad that the term has come to be dominated by ill-conceived consumer devices. I still have to remind myself that people are picturing "waffle iron that sends email" rather than "networked strain gages on bridges" when they savage IoT as a concept. ~~~ azeirah I've recently come to believe the same thing, always shrugged off the weird consumer IoT stuff because of the marketing, security issues and uselessness of a lot of the projected visions of the IoT future. I lose confidence in my rejection of IoT when I think of smart energy grids and cooperating sensor-networks. It's a useful and welcome addition to our future. ------ ChuckMcM An interesting read, a bit more inflammatory than I find informative but the points are pretty solid. Realizing a new device, especially a safety critical device, is something to be done with care (he rants about the Nest smoke detector). And it is "easy" to get irrationally exuberant about adding a computer to things that don't really need them. 
But I'd diverge a bit on claiming the _practice_ flawed rather than simply the application of it in this case. ~~~ Animats That's what bothers me about IoT projects - little consideration of failure modes. Nest thermostats have failed in modes that resulted in houses having no heat, and the user couldn't override it locally.[1] Combine this with active attacks, and it looks really bad. Over three weeks after the attack, Maersk Lines is still struggling. Their big ports didn't achieve close to normal operation until about last Monday. Their big automated port in Rotterdam is running again, after being totally shut down for two weeks, but there's much more manual paperwork than usual, and billing is completely down, so they have zero revenue.[2] LA and Elizabeth NJ are finally back up. Some customer-facing functions were completely re-implemented with simpler web sites. Container tracking is still down. This is the world's largest shipping line, 24 days after the event. [1] [https://www.nytimes.com/2016/01/14/fashion/nest- thermostat-g...](https://www.nytimes.com/2016/01/14/fashion/nest-thermostat- glitch-battery-dies-software-freeze.html) [2] [http://www.maersk.com/~/media/20170629-operational- update/20...](http://www.maersk.com/~/media/20170629-operational- update/20170714_5pm_nwc_faq.pdf?la=en) ~~~ pmarreck The Nerves project ([http://nerves-project.org](http://nerves-project.org)) is an IoT OS which basically boots a device into Elixir, and benefits greatly from the battle-tested Erlang BEAM VM which is extremely fault-tolerant. Do you have a link to the Maersk story? My dad used to work in shipping. Also, my original Nests decided out of the blue one day that their power wires were both disconnected (which is impossible); is this bug related? ~~~ willtim Both Erlang anr Elixir, while safer than C/C++, are still far from ideal for any critical system, since they lack any kind of static typing. ~~~ pmarreck Citation needed, especially since 95% of the time your code is just dealing with some combination of lists (arrays in OO langs), maps, strings, ints, and symbols/atoms, and is easily unit-tested ~~~ Hasknewbie It's interesting that you ask for a citation, and then immediately proceed to essentially make stuff up ("95%", "easily unit-tested"). Maybe you could provide a citation for these? There is a long-running trend in the industrial sector to use statically-typed languages in large and/or critical projects, so there must be reasons for dynamic languages to be quasi non-existent there, and a change would require more than "just easily unit-test it", no? (To be fair Erlang is in fact the one notable exception, although it's -- as far as I know -- only used in the telecom industry) ------ justinsaccount > I can't buy something that just has a $5 microprocessor with just enough > intelligence to connect to the internet and send me an email or a push > notification if the buzzer on the washer goes off. 1000000 x this. The only thing I'd ever want a smart device to do is have a way to configure a mqtt or such endpoint that it should connect to to send/receive simple messages. Just no consumer actually wants this. They want to be sold a box that connects to someone elses mqtt broker so it can work with an app that connects to someone elses server. It's just another front on the war on general purpose computing. 
~~~ astebbin > a $5 microprocessor with just enough intelligence to connect to the internet > and send me an email or a push notification if the buzzer on the washer goes > off You have essentially described the Protect. Its brain is a cheap little MCU with 128kB RAM. The author, had he read an actual teardown [0], should have realized this. Instead, he made false and confusing claims about Linux. Certainly other companies put grossly-overspecced hardware in their "Smart" devices, but Nest's Protect isn't one of them. [https://www.ifixit.com/Teardown/Nest+Protect+Teardown/20057](https://www.ifixit.com/Teardown/Nest+Protect+Teardown/20057) ~~~ wrs Linux specifically is wrong, but according to ifixit, there are two general purpose 32-bit processors, and two radio SoCs also containing 32-bit processors. Which to me still seems like a _heck_ of a lot of CPUs and software for a smoke alarm. ------ pishpash "I need to preface this by saying that I have very little faith in the worldliness or general sense of Silicon Valley hardware engineers. I have seen a long history of extremely poor decisions from that part of industry, so I will assume they made all the worst decisions." Old school engineers designed bridges, dams, nuclear plants, transformers, etc. They took a serious oath and got a license. Software "engineers" \-- and let's be honest, most are closet software engineers, even the "hardware" ones -- are not cut from the same cloth in approaching problem solving, to the major detriment of society. I never trusted them. ~~~ ew6082 As one of those engineers, I often wonder. In my line of work, if you don't have a stamp, you're not an Engineer. While NCEES offers a PE in software, how many in software actually take the PE exam? Is Professional Engineer even a thing in Silicon Valley? ~~~ astebbin As a Silicon Valley software developer, I haven't seen many credentialed engineers. The vast majority of my colleagues in software have a degree or two in Computer Science, but no formal engineering certification. The few I've met I would describe as "incidentally" credentialed, meaning they went to a school (commonly Waterloo) where it was a requirement for graduation. ------ _wldu A friend of mine likes to say, ___" The S in IoT is for security"_ __. ------ gumby Nest is doubling down on the dumbness too. Their nest cams ship _everything_ up to the cloud for processing! What happened to computing at the edge? Are Apple (of all people) the only ones to still embrace this philosophy? ~~~ jff Processors are cheaper than ever, and there are multi-core multi-gigahertz processors with multiple gigabytes of RAM scattered throughout most homes but yes, as you point out, nobody wants to use them, they want to ship them up to "the cloud" or as we used to call it, "some servers on the Internet". A charitable soul might say that by keeping local software simple, the end- user's devices don't need to be updated as often, while the software running on the cloud systems can be continually enhanced. A cynic like myself would say they prefer to use the cloud: 1) because it's easier to gather serious data about the users for later use/sale 2) because by tying users to an online service, they have something valuable ("2 million active users!") to offer when the startup inevitably gets bought out 3) because you can get away with less efficient code if you're running on a big timeshared server vs. on a small battery-powered device 4) because "the cloud!" 
is still an effective marketing gimmick ~~~ mahyarm As an ios mobile dev, I get jealous of my fellow android dev's ability to roll out updates incrementally and to publish updates immediately. With a cloud model, you don't have the possibility of bricking your customer's device as a failure mode. 1) You can actually gather more data on the device vs just a remote service 2) SaaS moats are attractive from a business model perspective 3) TBH pushing compute on your customers devices is cheaper for the cloud owner, but harder to manage. On a per user basis most servers are actually VERY efficient. Most peoples smartphones are relative supercomputers, only some things will be very battery draining. 4) 'the cloud' does enable a lot of stuff that people don't want to manage manually themselves ------ bsder As much as I like slagging the Internet of Crap, smoke detectors are not a great example because smoke detectors are remarkably complicated. They have to work. For years. They have to detect disgustingly small signals. Reliably. And, by the way, the chips have to survive in close proximity to a radiation source. And generate an alarm loud enough to wake the dead. If I remember correctly, Qualcomm nee NXP nee Freescale nee Motorola used to keep an old 2" aluminum gate fab line around because every time somebody tried to make a new smoke detector chip _something failed_. So, they kept the fab open and simply printed money. ------ lxe > Put it on top. Connect the I/O lines on your little PC to the output of the > smoke detector chip. Let it do the heavy lifting, the stuff that you aren't > sure you can do safely and which it has proven it can do for decades. I worked on a network communications part (the little PC) of a device that controls train signals (the smoke detector chip). This is exactly how it's architected -- the safety-critical components are isolated from the non- safety-critical ones. The busybox linux board does not go in between the sensors and the vital control logic, but rather "on top" ------ jaxbot Has anyone tried one of the newer revisions of the Protect and had any experiences, positive or negative, with it? I was initially excited when I heard about the product announcement. It seemed like everything I wanted in a smoke detector: remote monitoring, ability to silence, starting with voice before blaring and getting my adrenaline up, and, well, a smoke detector. But Amazon reviews and the infamous video later, I backed quickly away. I've been awoken by a false-positive smoke detector before and the experience is panic inducing on top of outright annoying. ~~~ DIVx0 I just bought a house that has two 2nd gen protects along with traditional style detectors, So far so good! No false positives so far. I reset both detectors and had them go through their self test when we moved in and they both report being fully operational. Neither of the protects are near the kitchen or regular smoke producing spaces though. However They are near bathrooms. I hear steam can set these off pretty easily but I have not had that experience. I might buy more protects to replace the other detectors but I want to wait a bit and see if nest comes out with any other products. ~~~ pcblues I would test them with smoke just to see how each behaved. Try with smoke from a candle just put out, a piece of paper just put out, and a plastic wire just put out. ------ ksk I think it also requires a mindset that a lot of young engineers don't have IMHO. 
Very few of them will ever ship code that will run 24/7 without patching, and be stable out of the box. ~~~ ocdtrekkie This really worries me when they start writing critical applications. I did have to reboot an old school program at work today. Very dated piece of software, pretty sure it predates Windows 7. It had been running, over a (fairly complicated) network connection for 303 days. I can't think of many modern pieces of software that do their job 24/7 for 303 days straight. And I'm pretty sure the reason it was 303 days, was because I had to move that machine last year. ------ MarkMMullin One thing I believe that's not being properly considered is the fact this is really a multilevel problem - if you want to make a smart IoT device you still want a reliable device. "On top of" is absolutely the best concept. Yes, the biggest computational bang for the buck comes from stuffing linux or something similar in, but who doesn't react with a smile when they see a multi year uptime reported - it's not rare, but it's not anywhere near a majority either - Start with the basic circuit, on top of that you add microcontroller logic to mediate between the higher level ARM SoC or whatnot and the base circuit, and just as importantly, you've also got a microcontroller based watch dog that will kick linux if it falls over. My experience in building this [https://hackaday.io/project/21966-quamera](https://hackaday.io/project/21966-quamera) is to never stop asking "Quis custodiet ipsos custodes" (anybody proffering analog solutions will be ignored, analog is hard and I am stupid) ------ jancsika > First, begin with a smoke alarm. A tried and true design that you can buy > for $10. Buy the exact parts that are already in use, and put them in your > final product. The smoke sensor, the transistors, the through-hole > resistors, the Fairchild IC, the 9V, the LED. Buy all of that and use it in > your final product. So... why didn't Nest do it this way? ~~~ debacle Software people designing hardware. ~~~ jancsika But why not piggy-back atop of the legacy hardware? Doing that would save time and let the software people solve higher-level (and presumably more interesting) problems. Besides, software people I know are way to lazy to go sticking their fingers in sockets. ------ nivramarvin The mobile-friendly view, recommended on top of the article - is kind of a usability disaster: I really like the authors approach using plain HTML. But using the mobile-friendly view has many usability pitfalls like: No links, no ability to copy text, no ability to search on page and switching to another browser-tab will lose the current position where you were reading at. As said in the article, sometimes its better to keep things which just work and are used by the majority like that. Same goes for using some styles or a blog template to provide a a fully mobile friendly read for the user, not a compromise of two bad options (plain article or mobile-"friendly" view). ------ luord All of these are excellent arguments against _most_ of the IoT practices, and not even including the security issues. All in all a great read[1]. These are all the reasons for why I'm not interested in IoT/embedded. I just don't have the hardware and low-level firmware knowhow and doing it the other way, starting with an OS and going down, seems (and evidently is) horrible. And I say this as a full-time software developer. [1]: Even if there might be inaccuracies; the author admitted the possibility and the point stands. 
------ rubatuga I’m just wondering, as we shrink our gates for CPUs more and more (the gates are approaching 10nm soon), does it become less reliable? People always want the next fastest and most power efficient device, but surely we must be losing something with the shrink in CPU gate size! If that’s the case, are CPUs such as the Intel 286s more reliable? ~~~ pacificmint The vast majority of issues will arise from software defects, not hardware defects. So rather than switching to a 30 year old cpu, increasing the quality control and testing for the software would be the better aproach. Also, as some peope in this thread have mentioned, a smart smoke detector should be designed in a way where if it fails it becomes a dumb smoke detector, but the actual smoke detection and warning should still work. ~~~ sigjuice _The vast majority of issues will arise from software defects, not hardware defects._ Exactly. The last thing I want in my house is a smoke alarm with software written by idiots like me. ------ rubatuga A lot of the research into redundancy for computer systems has already been done. Check out this article from NASA when they began to implement multiple failsafe computers into their spacecraft. [https://history.nasa.gov/computers/Ch4-4.html](https://history.nasa.gov/computers/Ch4-4.html) ------ kootenpv The best article I have read in a long time. ------ frozenport Tldr; build an analog backup into a mission critical device. The argument about not reinventing the wheel might be misinformed, because their custom chip could actually be a well known design with a tiny tweek. ------ grapeshot If you're writing an article, is it really that hard to do a little research? Most of this piece is just "I didn't even bother to look up a teardown photo, but here's how I assume they must have designed it". ~~~ sgt He explicitly says that while he didn't tear down one himself, the folks at ifixit did and then he included a link to the photos.
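Picking up justinsaccount's wish upthread for a dumb device that just publishes to a user-configurable MQTT endpoint, here is a minimal sketch of the receiving side. It assumes Node's mqtt package; the broker URL and topic names are made up, and nothing here comes from the article or any real appliance:

    // Listen for a "washer done" message and turn it into whatever
    // notification you like. Broker URL and topic are assumptions.
    import * as mqtt from "mqtt";

    const client = mqtt.connect("mqtt://broker.example.local");

    client.on("connect", () => {
      client.subscribe("home/laundry/washer/done");
    });

    client.on("message", (topic, payload) => {
      // Swap this for email/push; a log line keeps the sketch self-contained.
      console.log(`${new Date().toISOString()} ${topic}: ${payload.toString()}`);
    });

The appliance side then only needs enough smarts to publish one tiny message to the same topic when the buzzer fires; everything cleverer than that can live outside the device, in the spirit of the "escalator, not elevator" advice at the top of the thread.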
A History of Flavoring Food with Castoreum - tlrobinson https://munchies.vice.com/en_us/article/a3m885/a-history-of-flavoring-food-with-beaver-butt-juice ====== JohnJamesRambo People like Food Babe are this era’s snake oil salesmen. As a scientist, I’m constantly appalled at the amount of sheer ignorance and provably wrong facts that are out there concerning food and chemistry and believed by nearly everyone in those food and health circles. ------ gerdesj Britain is tentatively reintroducing beavers after wiping out the last ones in the 16th century. They might be handy for controlling rain run-off - trials are ongoing. I'm sure it won't be long before someone decides to lick an arse by accident or something ...
Announcing Kubeflow 0.1 - TheIronYuppie https://kubernetes.io/blog/2018/05/04/announcing-kubeflow-0.1/ ====== TheIronYuppie Heya everyone, cofounder of the project here. Thank you so much to the Kubeflow community for helping us reach this milestone! Our goal was to get the base level packages all wired up and let anyone start a machine learning project anywhere Kubernetes runs in just a few commands. Please let us know if you have any thoughts or what you'd like us to work on next! Disclosure: I work at Google on Kubeflow.
Go team adding a Go language server to core - sqs https://go-review.googlesource.com/c/tools/+/136676#message-11c783bc9a9f6adf6119bbb85c89510fda25abe9 ====== dev_dull > _Tooling is so important to the developer experience for any language, and a > language server is one of the most important kinds of tooling there is > (because the user interacts with it on virtually every keystroke in their > editor)._ Here’s how languages backed by large corporations win. They have the resources and the optics to know why these non-fun features are important. ~~~ stcredzero Since when is tooling non-fun? This depends on a lot of factors. ~~~ asdkhadsj Exactly. Hell, I consider tooling to be some of the most fun _(and often difficult)_ things I work on in my free time. If I can make something easier for my normal dev environment I see huge gains ~40 hours a week. That's pretty damn awesome and flashy to me. It's definitely not easy, but imo it's well worth my time. ------ flaque Possibly dumb question: what is a language server in this context and what’s it used for outside of the AST package? ~~~ blcArmadillo Look at [https://microsoft.github.io/language-server- protocol/](https://microsoft.github.io/language-server-protocol/) for more info. It's a standard interface IDEs can use for things such as autocomplete rather than every IDE having to reinvent the wheel. ~~~ justaj Interesting, I wonder if Vim is able to use this protocol as well. ~~~ int_19h [https://github.com/prabirshrestha/vim- lsp](https://github.com/prabirshrestha/vim-lsp) [https://github.com/autozimu/LanguageClient- neovim](https://github.com/autozimu/LanguageClient-neovim) ------ shabbyrobe I'm not sure I fully understand the provenance of the code here, but I could just be being a bit thick. The files in the change request show the copyright as "2018 The Go Authors", but some of the commentary hints that this is code donated from the Sourcegraph implementation. If this is the case, does that mean Sourcegraph has transferred the copyright to the Go authors? ------ pokstad I’m going to guess this is in response to many community supported dev tools breaking in v1.11 when modules are turned on. Specifically tools like gocode, godef, and gogetdoc. ------ asdkhadsj Does anyone know if this is introducing more features than the existing Go LSP[1]? I'm all in favor of tooling support - I'm just curious if this actually adds anything to those of us already using a Go LSP. Thoughts? [1]: [https://github.com/sourcegraph/go- langserver](https://github.com/sourcegraph/go-langserver) ~~~ sqs Sourcegraph CEO here (and author of a comment in the link + contributor to [https://github.com/sourcegraph/go- langserver](https://github.com/sourcegraph/go-langserver)). The idea is that with a Go language server becoming a core part of Go, it will have a lot more resources invested into it and it will surpass where [https://github.com/sourcegraph/go- langserver](https://github.com/sourcegraph/go-langserver) is now. ------ vbezhenar It's interesting that Jetbrains refuses to leverage those language servers and writes their own parsers for every supported language. ~~~ gameswithgo very few languages implement the RLS well. For instance Rust does it, but its incomplete and very slow. I bought a quieter CPU fan specifically because of it! ~~~ int_19h It's more a function of language complexity than anything else. LSP for something like C, Java or TypeScript is not that hard, because the language is fairly straightforward to parse, and not ambiguous. 
But something like C++ or Rust, where you have to deal with templates/generics/traits/... and type inference starts getting gnarly. And something like JS or Python is hard because there are no static types at all, and you have to do inference. But LSP itself doesn't really make it easier or harder. If you look at other code completion implementations (in IDEs that don't use LSP), they're usually at a comparable level, if they exist. ------ adiusmus Every language has support ... except the ones that don’t. [https://langserver.org/](https://langserver.org/) ------ wasd I wish every language would do this. ------ lobo42 Cool! We wrapped the VS Code extension in the protocol to support it in gitpod.io. See here [https://github.com/theia-ide/go-language- server](https://github.com/theia-ide/go-language-server) Would be really awesome to use something maintained by the people who know Go best :) ------ wenc Curious about language servers (especially those implementing LSP): how do they achieve low-latency responses? My understanding is that JSON-RPC is used, which is relatively high overhead. Are there lighter weight alternatives? (gRPC?) ~~~ tragic Because they're typically running locally they're quick, or at least as quick as your hardware will allow. The json-related latency is typically dwarfed by the latency from actually doing the functionality (parsing the code for code completion, etc). So the real work is in having a good story for incremental compilation and a generally quick and robust compile toolchain. I guess that Go is in a good position here ... ------ mark_l_watson Good idea. Languages like Haskell benefit by having a language server that Emacs, VisualCode, etc. can all use. It just make good sense to do the work once and let all editors and IDEs use it. ------ fithisux This is great news. Hopefully we will see others follow. ------ logicallee 2 unrelated questions. (tl;dr at bottom) Background ---------- The two major advantages of Go are its clear, unambiguous and explicit syntax (unfortunately ugly) and very small (minimal) feature set, so that it is possible to reason about anything without any high context. The former means it is possible to pick up Go in a day or two and the latter means that there are no real tricks after a few months. That's it. All this has been done in an excellent way. There is no class system. Everything is simple. Explicit typing works. These design trade-offs have led to a language that is VERY easy to go back to after working in any other language. Almost all simple syntax mistakes by the programmer are caught before something successfully compiles. Once a programmer is used to it, which happens quickly, Go is the only language people write pages of code in that compiles on the first pass and does everything they hoped. "Less is exponentially more." History ------- Due to the history of Go it was _meant_ to be a systems language. It didn't really succeed at replacing C anywhere, except the web server: I consider Go like a web-safe C. (In no small part due to Google's work meeting its own needs.) Question 1 ---------- What I don't understand is that since speed and being a systems language were _explicit_ design goals, why the Go syntax is not suitable for tiny, embedded microcontrollers. If the answer is something about garbage collection, this doesn't really make sense for me, as good algorithms should scale from 0 upward. 
I don't see why there should be a huge jump from "not running Go yet" to "now you're running a Go runtime" that suddenly requires a large amount of free memory. Here is a post about this: [https://medium.com/samsara-engineering/running-go-on-low- mem...](https://medium.com/samsara-engineering/running-go-on-low-memory- devices-536e1ca2fe8f) It seems this is high. I don't understand why Go doesn't take its core, primary strengths outlined in the "background" section above and expand it into the lowest of the low-end devices. Why can't I compile a Go program targeting a microcontroller with 2-8 MB of RAM without issue? Go was designed as a compiled systems language, to replace C. Not as a scripting language to replace Python. Question 2 ---------- Since explicit (if very ugly) syntax that keeps people from making mistakes, and whose specifications can be kept in mind all at once, is a strength of the language and syntax, why can't I develop desktop GUI's in Go? It is a fast, compiled language. You might say the answer to that is that a windowing system _needs_ a class system but I disagree - that can't possibly be any more true than the idea that a web app needs a class system. Clearly people make web apps in Go without a class hierarchy. Nobody makes desktop apps (I mean the class of applications that web browsers, IDE's, photo editing software, basically everything on a list like this - [https://helpdeskgeek.com/free-tools-review/best-freeware- pro...](https://helpdeskgeek.com/free-tools-review/best-freeware-programs-for- windows/) ) in Go. Of course, when it debuted you could say, "That's just not what the team focused on", since it was backed by Google they needed it on the server. But at some point you'd expect first someone to make a make-do GUI toolkit, then other people to add on and make something better, and by now it should be an obvious language for making a Winzip or Winrar competitor. Or an antivirus. Or an Audacity or VLC-type program. Or an OpenOffice clone. But nobody does these things. _Any_ of these things. Why not? The simple language has nothing missing that languages which are used for these applications have. I don't buy the idea of the ecosystem entirely lacking these. Why don't people make GUI stuff in Go? I don't understand why it shouldn't be an obvious language choice. I just wouldn't reach for Go when it comes to scripting a GUI application. Why not? Summary / tl;dr --------------- 1\. I would like to know what it would take to make the simple specifications behind Go, which are protected by knights in shining armor, an obvious choice for embedded microcontroller applications in 2018. 2\. I would like to know what it would take to make the simple specifications behind Go, which are protected by knights in shining armor, an obvious choice for desktop applications in 2018. ~~~ logicallee Not going to edit the above but people clearly hate it (many downvotes without minutes of posting). At the moment I think most Go programmers who also program tiny microcontrollers with a few hundred KB or up to a couple of MB of RAM, do so in languages like C or Rust, not in Go. Why not? It's a fair question. For those of you reading this on a laptop or desktop, how many desktop applications are you all running at the moment or have as an installed application? How many are written in Go? Why so few or none? I mean desktop applications like Audacity for audio, or VLC, or Open Office, or your web browser, and so forth. The downvotes came pretty fast so you clearly have some ideas. 
I'd like to hear them. ~~~ striking >people clearly hate it Maybe because it's overlong, almost completely unrelated to the article, and might just be flamebait? But I'll toss in a few cents anyway. >Why can't I compile a Go program targeting a microcontroller with 2-8 MB of RAM without issue? Microcontroller toolchains are pretty controller-specific. It takes work to make things compile for them. There are other languages that fare better. Additionally, one of the more useful features of Go (concurrent goroutines) would simply be unavailable (and even with concurrency disabled, you still get unpredictable latencies from deciding which task should run when, the GC, etc). Just use C. >why can't I develop desktop GUI's in Go You can. [https://github.com/andlabs/ui](https://github.com/andlabs/ui) is an example of how you can (and I'm sure it's not the only one.) >Why don't people make GUI stuff in Go? It's common to write apps using whatever the OS provides or using well-known libraries, so that's something Go's offerings don't offer. It's so much work already to build a GUI app that most people doing so probably don't want to concurrently write a GUI library too. I don't think there's anything stopping you, though. Next time, avoid writing so much to ask two (three?) simple questions. You come off as having an attitude when you use phrases like "Not going to edit the above", "knights in shining armor", and so on. You also build up and tear down many bales worth of strawmen, which is pointless. It would have been a better use of your time to actually look into either of those fields and discover the answers for yourself. ~~~ logicallee Thanks for your responses, which I read carefully. I hope we can get a bit more out of this conversation. "Just use C" is odd to me given that Go was originally developed to be an alternative to C (a "systems language") - this was an explicit design goal, and mentioned in the first main sentence of its specification[1]: "Go is a general-purpose language designed with systems programming in mind." So when you then glance at the adoption for those uses, you would come to the (wrong) conclusion that it must have failed to meet its original design goal of being better for those general use cases. (Or people would be using it). According to its designers, C and C++ programmers don't transition to Go (as its designers had thought they would), but Python programmers do. So I'd like to understand what it lacks that is holding that back. I see it as clearly superior. Looking at the design, it looks like it met its design goal. So I see it as successful in its bid to be design that is a viable replacement for C as a "systems language", and then despite this successful design now nobody uses it as a systems language. It would be as though someone set out to make a better bicycle wheel, succeeded, and then nobody uses it as a bicycle wheel (despite its being a better bicycle wheel), while meanwhile the design is being used to haul thousands of tons of freight per year (server microservices). I'm asking, "so why don't we use it as bicycle wheels" and am told "just use a wheel". But I want to understand why, because it seems to me it met its design goal of being a better bicycle wheel. For the second statement, your statement about "common to write apps using whatever the OS provides or using well-known libraries," makes sense and explains why these libraries didn't exist when Go was made. 
I don't understand why they are still not a big part of the ecosystem nearly a decade later. The "knights in shining armor" is a serious statement: the biggest benefit of the Go language is its gatekeepers keeping things out of it, and its extremely orthogonal, tiny syntax, which can be kept in mind along with state. The results of code can be reasoned about without a lot of gotchas. I am therefore wondering what, if anything, these gatekeepers are keeping out which prevents the two parts of the ecosystem which I mentioned, from flourishing. I don't think Go programmers who are also C and C++ programmers who are used to writing for embedded microcontrollers or for Windows in C and C++ respectively, require deep (or any) changes to the language before _as a language_ it would be suitable to those two cases. You haven't mentioned any. So I still welcome more viewpoints. I don't see why Go can't replace C and C++ applications across all of the use cases I mentioned, with minimal first-party support by the language developers and maintainers. Doesn't Go also have alternative Go compilers based on the specification? I did a Google search for alternative Go compilers and the top result is from 2014: [https://groups.google.com/forum/#!topic/golang- nuts/EwDS15Ew...](https://groups.google.com/forum/#!topic/golang- nuts/EwDS15Ewfs0) As a followup, why is this the case? I mean, the Specification is quite clear, and it's a tiny language, so in the past 9 years I don't see why there aren't at least a few dozen independent implementations. Is it just not enough time, and we should wait a few decades? Is the issue that Google doesn't need Go for its desktop or mobile development stuff, and doens't need it for microcontrollers, and Go is not really an "independent" entity that is quite independent of Google as a company? Basically I am wondering why Go in 2018 isn't where C and C++ were by 1998, for example. Despite your detailed answers I don't see the missing pieces. Go as a syntax is better than C or C++ as a syntax. Its syntax and small and stable language features set are Go's primary advantages, which allows it run on the metal using clearly written standard Go libraries to serve web apps. To me, a microcontroller program running with next to no memory just has to not leak memory or crash, unlike Go running microservices, which have to withstand highly targeted and sophisticated attacks. Basically, given how I see Go (which is the reason for my very lengthy introduction, so that people could correct me if I'm wrong) I don't see why the language is not very far along where C and C++ blossomed as languages for embedded microcontrollers, and for Desktop applications. In a way you could say I don't really get what's preventing this. I can't think of any showstoppers. [1] Go language specification: [https://golang.org/ref/spec](https://golang.org/ref/spec) ~~~ striking I tried to find a succinct question to respond to for your first section, and couldn't. My answer is the same as before: no one has bothered to write a Go runtime that can run on microcontrollers. This is because Go does not do anything fundamentally different from C at that level, except things that are less suitable for low-memory and low-CPU environments; and because it takes a lot of work to write a compiler and runtime for these devices. Note that there are very few compilers for each individual microcontroller. They generally only have one or two. >I see it as clearly superior. You are not looking hard enough. 
Go has flaws, as does any language or tool. It is almost a rule of thumb that not understanding the flaws of a tool is a sign of inexperience with it or alternatives to it. I think if you actually tried to learn and use C and C++, you might better understand the difference. >I don't see why Go can't replace C and C++ applications My answer is the same as before: it can, except that C/C++ apps already have that territory, and Go doesn't provide a fundamentally different experience except in lacking libraries for most GUI applications. There are still places it shines, though. For example, it is doing a fine job of replacing C/C++ for cloud orchestration tools; static linking is important to those users, and Go provides it easily. >Basically I am wondering why Go in 2018 isn't where C and C++ were by 1998, for example. This is a mathematical error. The C programming language was developed in 1972, and C++ piggy-backed off of C's ecosystem very effectively. >Go as a syntax is better than C or C++ as a syntax I don't think I can come up with a better example of a "skin-deep" comparison, nor do I think anyone has actually argued for C/C++'s syntax being actually good. >stable language features Didn't Go 1.11 break all of the community's tools? Isn't that why the story we're posting on the comments on exists? >To me, a microcontroller program running with next to no memory just has to not leak memory or crash, Some microcontroller programs need to have very low latencies, or the physical things they are connected to will break (the GC defeats this). Some microcontroller programs are only allowed a couple of KB of RAM (Go can't handle that). Most microcontroller programs are compiled without a standard library (and Go would have to provide its own impls of basic Go stuff in order to function). Some microcontroller programs never allocate memory, cannot allocate memory, or must allocate all memory up front. And so on. You can solve these things by having more expensive and fault-tolerant gear, or buying more RAM, or paying people to write more Go compilers/runtimes, etc., but these things get expensive fast; that is often incompatible with the design goals of what microcontrollers are used for. You're welcome to write Go and slap it on a Raspberry Pi, though. >In a way you could say I don't really get what's preventing this. Try writing a microcontroller program and GUI app, both in C or C++ and in Go. I very strongly doubt you will succeed faster in Go than in C/C++, in either kind of project, because of multiple factors: ecosystem, communities, example code, language features, existing toolchains and compilers... But if you don't believe me, try it out for yourself. ~~~ logicallee I read this several times hours apart. Lots of information, thanks. I don't have many questions except this: "Some microcontroller programs are only allowed a couple of KB of RAM (Go can't handle that)." What in here... [https://golang.org/ref/spec](https://golang.org/ref/spec) ...can't you possibly implement in only a couple of KB of RAM? (For what required feature or features can we say for sure that a couple of KB of RAM is insufficient to implement as specified?) ~~~ striking As it turns out, the only mention of the feature is in that link's introduction. The garbage collector ([https://github.com/golang/go/blob/master/src/runtime/mgc.go](https://github.com/golang/go/blob/master/src/runtime/mgc.go)) makes many references to 4KB pages and 1MB arenas. 
This (the code and the user of the technique) would need some serious reconsideration for small devices. ~~~ logicallee Isn't the specification the actual language, and Google's compiler just one implementation? (That's true for C/C++.) Is the specification underspecified or something? We're not talking about a syntactically huge language...
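A minimal, concrete footnote to the protocol discussion above (blcArmadillo's description of LSP as a shared editor interface, and wenc's question about its JSON-RPC transport): the sketch below shows roughly what a single completion request looks like on the wire. The file URI, cursor position, and request id are invented for illustration, and a real client would first exchange an initialize request; only the Content-Length framing, the JSON-RPC 2.0 envelope, and the "textDocument/completion" method name come from the protocol itself.

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // completionRequest mirrors the shape of an LSP "textDocument/completion"
    // call as carried over JSON-RPC 2.0. Field names follow the protocol; the
    // concrete values filled in below are illustrative only.
    type completionRequest struct {
        JSONRPC string `json:"jsonrpc"`
        ID      int    `json:"id"`
        Method  string `json:"method"`
        Params  struct {
            TextDocument struct {
                URI string `json:"uri"`
            } `json:"textDocument"`
            Position struct {
                Line      int `json:"line"`
                Character int `json:"character"`
            } `json:"position"`
        } `json:"params"`
    }

    func main() {
        req := completionRequest{JSONRPC: "2.0", ID: 1, Method: "textDocument/completion"}
        req.Params.TextDocument.URI = "file:///tmp/example/main.go" // hypothetical file
        req.Params.Position.Line = 9
        req.Params.Position.Character = 4

        body, err := json.Marshal(req)
        if err != nil {
            panic(err)
        }
        // LSP frames every JSON-RPC message with a Content-Length header,
        // terminated by a blank line, followed by the JSON body.
        fmt.Printf("Content-Length: %d\r\n\r\n%s", len(body), body)
    }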
{ "pile_set_name": "HackerNews" }
Technical Debt - BerislavLopac https://martinfowler.com/bliki/TechnicalDebt.html ====== ljm My only bone of contention is that taking on debt is usually a conscious choice. Technical debt to my mind is the price of making a pragmatic decision, it’s the trade-off that lingers. Making poor architectural decisions or writing poor quality code isn’t tech debt. They are just institutionalised consequences. The way the world now works. Legacy code isn’t tech debt either: it was right for the time. What you have instead is knowledge debt as engineers turn over, and the once intuitive and elegant parts of the codebase are utterly alien to the new hires who learned their craft in a totally different way. This isn’t tech debt because engineers can’t intuitively anticipate the business’s future hiring decisions, particularly when those are made independently of the team. If you want to establish tech debt you have to define the schedule to pay it off and there should be some record of that work somewhere, with accountability attached. It is not merely the accumulation of things you don’t like the look of or find hard to work with. Similarly, I think MVP is a conscious choice to incur product debt because it is far too risky and irresponsible to go all-in with your perfect vision right off the bat. You want your initial buy in and a chance to bail out or pivot before all is lost. Edit: as an example that comes to mind, a bank running on COBOL doesn’t have tech debt unless their COBOL engineers are aware of their missed opportunities. If they think COBOL is no longer fit for purpose, it’s not debt, it’s time to reinvest. ~~~ coding123 I think of all the code written as technical debt. As soon as you write it, it becomes a burden - just like regular financial debt. However instead of thinking of code like a credit card loan that will take a few years to pay off, think of it like a mortgage that you have for 30 years. At the 30 year mark, you will generally have retired the code, hopefully. If you're not so lucky, you should expect to see balloon payments coming shortly. I know this isn't a perfect metaphor because honestly "technical debt" applies to all code, not just slightly bad code. That's very different from a loan. ~~~ Jedd I suggest that all code is a _liability_ , rather than a debt per se. This assumes all things can be usefully viewed through the prism and language of economics (though my gut feel is few things can or should be). ~~~ aaronchall Liabilities _are_ debts. > liability: a thing for which someone is responsible, especially a debt or > financial obligation. According to the balance sheet equation, > Assets = Liabilities + Owners Equity Certainly all code _is_ a liability - it must be at least understood and maintained from time to time (80% of lifetime cost is in the maintenance). And while an accountant might also mark it an asset - not all code is an asset - and we recognize that fact when we delete for badness or disuse - or rewrite it to make extending and maintaining it easier. ~~~ Jedd As intimated, I'm not sure that the fuzzy language and world of economics is the best way to view such things, however ... I'd accept that debts are a subset of liabilities, but not that 'liabilities are debts'. A debt implies an externality, while a liability implies some generic exposure -- it's that which I focus on when considering (all) code as a risk to an organisation. I suspect, however, that we agree on the fundamentals. 
There's some cost and risk to developing and trying to maintain any code base, and accepting that can usefully inform development and operations work better than thinking of code as a (near-)risk-free asset. ------ rsweeney21 I cut my teeth as a developer on the Windows operating system. Because we were a platform, we had learned that when we shipped an API it became more or less permanent. So technical debt was a huge part of our planning and thought process. When I joined Netflix I was assigned to add DASH support to the Silverlight web player. I spent weeks working on the new architecture, refactoring the streaming code, etc. One day my manager stopped by my desk and said that "I needed to wrap it up. It was taking too long." I was still weeks away from being done. I explained to him that I was cleaning up a ton of technical debt and that's why it was taking so long. He explained to me that they had different values on his team. 1\. Most code will be rewritten every 2 to 3 years. 2\. For code that doesn't get rewritten often, preserving battle-tested code trumps paying down technical debt. Just remember that technical debt doesn't apply to all software. ~~~ varjag > 1\. Most code will be rewritten every 2 to 3 years. a.k.a. defaulting on the technical debt ~~~ thaumaturgy No, this is still a payment against technical debt. From the top of Martin Fowler's article: "The extra effort that it takes to add new features is the interest paid on the debt." If adding a new feature starts with, "just throw out all the code and start from scratch", that's a much larger cost over the long term than building code that's easier to maintain and refresh. ~~~ sidlls With very few exceptions code that is "easy to maintain and refresh" over any appreciable time horizon (say, longer than 2-3 years) requires the kind of engineering effort that the modern world of "CS trivia is good interview material" generally derides. I'm not even talking about "real" engineering processes as might be used for space systems or whatever; I mean just basic requirements analysis and technical documentation. ------ kashyapc [I recall posting this here in the past, but worth re-posting.] The "handy rule" from the book _Team Geek_ \-- the newer edition is called _Debugging Teams_ [1]. It's from chapter "Offensive versus Defensive work": _[...] After this bad experience, Ben began to categorize all work as either “offensive” or “defensive.” Offensive work is typically effort toward new user-visible features—shiny things that are easy to show outsiders and get them excited about, or things that noticeably advance the sexiness of a product (e.g., improved UI, speed, or interoperability). Defensive work is effort aimed at the long-term health of a product (e.g., code refactoring, feature rewrites, schema changes, data migration, or improved emergency monitoring). Defensive activities make the product more maintainable, stable, and reliable. And yet, despite the fact that they’re absolutely critical, you get no political credit for doing them. If you spend all your time on them, people perceive your product as holding still. And to make wordplay on an old maxim: “Perception is nine-tenths of the law.”_ _We now have a handy rule we live by: a team should never spend more than one-third to one-half of its time and energy on defensive work, no matter how much technical debt there is.
Any more time spent is a recipe for political suicide._ [1] [http://shop.oreilly.com/product/0636920042372.do](http://shop.oreilly.com/product/0636920042372.do) ~~~ ben_jones I think that's a leaky abstraction. Code refactoring can have a huge performance impact that is 100% noticed by the end user. ~~~ tj-teej I think the OPs paradigm holds. If the Code Refactoring has a huge performance impact then it's Offensive work, if it has no impact, then it's defensive. ~~~ brightball IMO you just pointed out the key to selling the work. Reclassification. ~~~ hinkley It’s dirty accounting tricks for a greater good. I kinda resent the fact that we have to resort to this. ~~~ brlewis A huge performance impact is a shiny thing that is easy to show outsiders and get them excited about. There is no dirty accounting here. ~~~ jolfdb The dirty accounting is that doing work to prevent bugs gets no credit, but letting bugs happen and then fixing them gets credit. ~~~ wolf550e Even worse, introducing shiny features in half-baked bug ridden way gets credit, the person who gets to fix all the bugs and possibly reimplement the whole thing correctly gets no credit. ------ baby_wipe I've been thinking about the difference between video game software and most CRUD software. With most CRUD software you are expecting future changes, so you have to be more conscious about tech debt. But with video games (at least older ones), once you ship you're more or less done. I wonder if that gives room to take on extreme debt at the last minute before shipping a game. An example might be when Steve Ellis added multiplayer to Goldeneye on N64 a couple months before launch. Since they were going to ship a finished product, he was probably able to build it much faster without having to worry about code quality. [https://www.engadget.com/2012/08/14/goldeneye-007s-multiplay...](https://www.engadget.com/2012/08/14/goldeneye-007s-multiplayer- was-added-last-minute-unknown-to-ra/) ~~~ steveklabnik One thing that's interesting about this insight is that many games are moving _away_ from this model, with regular updates and continued development. This doesn't invalidate your point, of course, but I also wonder how big of a mental shift this must be, because historically, when a game was done, it was done. ~~~ aarongray That's a great point. One of my favorite games, Enter the Gungeon, had plans for a major release of DLC, but they canned it and replaced it with a much smaller scope of free updates because they had so much technical debt that they couldn't attempt the big DLC release confidently. [https://www.reddit.com/r/EnterTheGungeon/comments/9ykpgc/onc...](https://www.reddit.com/r/EnterTheGungeon/comments/9ykpgc/once_more_into_the_breach_an_update_on_gungeon/) ~~~ steveklabnik Ah, thanks for that! I love that game too, and forgot about this, it’s a great example. ~~~ aarongray Ah nice! :) ------ rossdavidh I like the metaphor of technical debt, but...this essay does not address the most prevalent obstacle with paying down technical debt, which is that the people making the decisions don't care. They are in short-term mode, just trying to get this year's objectives met and IPO or sell or get a higher up job in a different company or division. They really don't care about the long- term health of this software product. Technical debt, as a metaphor, may make perfect sense to them, but they aren't the one who will pay the interest. 
Now, if you're working on open source and will be in this codebase for a decade, or this is your company or your software, it makes the tradeoffs much clearer, and that works because your incentives are right. So this metaphor works better for helping the typical developer make choices well, than for helping the typical manager/director/CEO make choices well. ~~~ foolfoolz This is usually a communication problem from engineering. You have to be able to describe the impact of the risks and benefits and make a business case for it. No one cares if you want to upgrade a framework. But if you are going to improve customer trust by using a newer version of something, resulting in fewer bugs and less downtime, it sounds a lot better. ~~~ rossdavidh Sure. But the risk of disruption is usually actually higher, in the short term, by paying off the technical debt. In the long run, of course, it will likely result in a lower risk of technical problems impacting customers. But the risk of there being a disruption right now is nearly always higher if we pay down technical debt than if we put it off. So again, you have to have something other than a short-term outlook on the part of the decision maker. Which does happen, of course, but not always, and not almost always either. ------ jaden One of the hardest challenges of paying off technical debt is that there's no visible improvement to the stakeholder. From a business perspective, they often think it's working now, why not leave it alone? Also, the metaphor starts to fall apart if you consider good debt vs bad debt in terms of financial investments. No examples of good technical debt come to mind, other than the benefit of shipping quickly. ~~~ LargeWu [https://www.youtube.com/watch?v=XakfJ2spb3w](https://www.youtube.com/watch?v=XakfJ2spb3w) A pretty good talk from Railsconf a few years back. The debt metaphor can be thought of as two axes: planned vs unplanned, and low vs high interest. If you explain to the stakeholder that by not paying down technical debt, they're only paying off the interest, but they have to keep paying it forever, perhaps that will sink in. The opportunity cost of those interest payments is profit-generating work. By failing to address technical debt they are constraining the business. ~~~ james_s_tayler I've come to adopt the metaphor of Technical Tax instead for this reason. Technical Debt is something for which you have a repayment plan and a timeline on which it will be repaid and you are making the repayments. Technical Tax is everything else. And your tax rate will keep climbing higher and higher unless you make a change at the policy level. ------ DoubleGlazing My previous job had an excessive amount of technical debt in its core product. So much so that simple code changes that should have taken a few hours were now taking days, or in some extreme cases weeks. For example, there were financial calculations that should have been confined to one class and treated like a black box, but over time bits of the calculation had spread out all over the application, from the front end, the server layer and into the DB. A simple change had a ripple effect on other parts of the application which made bug hunting and testing a slow and laborious process. After much grumbling from the devs the CEO prepared a presentation where he told us technical debt was our friend. We had a choice to pay down the debt, or add new features.
New features would get us more customers, and more income, which would mean we could hire more developers to eventually pay down the debt. I felt it was a very wishful thinking type of argument. We were having increasing numbers of outages due to the lack of a proper testing process in our CI pipeline and our DB was massively overloaded. Adding new features was literally adding to our technical debt and increasing our risk of outages, something that wouldn't help get new customers. My issue with the above argument is that the problem with debt isn't the amount or type, but the person holding the debt. A 30-something in a safe job (e.g. doctor) earning €100k a year can safely hold €300k of debt, so long as they are responsible and always strive to reduce the amount owed. A 20-year-old working in a fast food joint on minimum wage, but holding €50k of debt, is in serious danger. They aren't in a stable job and they aren't being responsible. Some companies are like the doctor. They have debt, but are responsible and actively work to reduce it. Some are like the 20-year-old who thinks that they can deal with it later, oblivious to the rising interest and danger they are putting themselves in. ------ allenu I dislike the term "technical debt" as it makes it appear that it is something that _must_ be paid. It takes a stance that there is a correct way to do something and that if you owe a debt it's because you've strayed from the correct way and that you need to restructure your work to make it conform more to this correct way. I prefer to see engineering as a series of trade-offs. These trade-offs often come about from not having enough information about future scenarios and yes, sometimes they come about because you are in a rush to do something. Either way, whether something should be redone is dependent upon a lot of factors, and calling something "technical debt" is a sort of way to politicize work as though it is "the right thing to do" and must be done. This feels too simplistic to me. Often, shortcuts we take will live for a long time in our code and often it doesn't matter that it's "wrong". If we call something technical debt, we should only call it that at the time that we determine work definitely needs to be done, not as we're going along and thinking "this is not the ideal design" as this will just lead to YAGNI designs. ~~~ WorldMaker > I dislike the term "technical debt" as it makes it appear that it is > something that must be paid. You don't _have_ to pay other debts either, you just have to be prepared to deal with the consequences of that (bad credit scores, bankruptcy). With real debt, too, you often have choices in what you pay off today, versus what can wait for next month or next year. There's also nothing inherently _wrong_ with taking on debt. Most people will have all sorts of debt in their lifetime, and some economists believe that debt is much more interesting economically than wealth ever is. Debt is not a moral proposition and calling something "technical debt" isn't saying that it is bad/wrong, it's saying that we're writing IOUs we may or may not have to cash at some point. It isn't necessary to avoid debt at all costs because it isn't wrong to have some debt on the books (beyond very conservative religious readings that feel that all debt is sinful, of course). ~~~ mobjack There is a lot of technical debt you can leave in the system without consequence. Tech debt mainly becomes an issue when making changes to the code.
If no changes are needed in the code and it is working correctly, it is best to leave the debt in there. ~~~ WorldMaker I think that gets to other people's points that if it doesn't have consequences it isn't likely "tech debt". If you don't "owe" it to someone, and often that someone is "future you", then it isn't really tech debt, it was just a trade-off or a poor aesthetic. We have other fun names for ugly aesthetic choices that work and get the job done and aren't likely to be replaced, such as "jerry-rigged" and "Rube Goldberg machine" and "duct-taped shambles". ------ Eleopteryx This is painful for me as a Rails dev because the real technical debt is always a lack of test coverage. Cleaning up cruft is great, except without tests I'm just going to break my application in the process (and probably not know it). Without first defining the behavior of the application, there's a reduced incentive to refactor anything. In this way, there are effectively two layers to the technical debt. This becomes a compounding problem, where the technical debt gets so great that it becomes more appealing to just keep pushing out features and dealing with the fallout. Everywhere I've worked, the code has ended up exactly this yucky. And yes, I've contributed to it too. The idea of paying for someone to go back and make the code nicer, with no tangible changes to the application's external behavior, just doesn't seem to resonate with my CTOs. Maybe I just need to find better places to work. ~~~ stank345 I have experienced the exact same thing in Python and it feels bad to not make changes out of fear of breaking something. So much so that I won't do refactoring on a large codebase written in a dynamic language with not great code coverage (i.e. pretty much every one I've worked on professionally). This is where static typing can help a _lot_ in my experience. Your types (and your function signatures) are contracts you still need to adhere to after you're done refactoring. Keep making changes until the compiler tells you it's all good and then usually it is. ------ kazinator That cruft is due to "requirements". It's easier to modify a system with less cruft, because modifying a system means adding, deleting or changing requirements. This is easier to do when there are fewer of them. If you think of a new requirement, it's easier to be sure that it doesn't conflict with six requirements than with sixty, or six hundred. Additionally, requirements are hard to remove/change once their implementations are deployed and depended on. Basically we are already saddled with debt at the requirement level, before we even consider the implementation quality. Thus, the first tool in the fight against technical debt is to very carefully manage the acceptance of new requirements. If a requirement doesn't exist, then its code doesn't exist, and that's as clean and maintainable as code can possibly be. Secondly, try to gather all of the requirements up-front before implementing anything, and then resist further requirement creep. If most of the requirements in a system were gradually introduced after implementation began, that will tend to degrade the quality. ------ Traster I kind of like the idea of calling it technical debt, because I can declare technical bankruptcy (move companies). But really it doesn't work, partly because technical debt very often is actually the result of doing a slap-dash job in the past.
If this code was unimportant enough to just throw together in the past, you really need to ask yourself whether it's important enough today to fix. This feeds into one of the fundamental personality types - someone who may be more interested in making the code beautiful and streamlined and neat than actually improving the bottom line. There are cases where you need to consider technical debt, but most often it should be considered as something you create rather than something you inherit, because the creation of the technical debt is where you're making the judgement: * Is delivering now important? * Is this code going to need to adapt and develop in the future? * Is the code this is building upon going to stick around, or will this code likely be superseded? * (Selfish developer: Am I going to be around to deal with this?) ------ Bokanovsky I think when trying to categorise technical debt the Technical Debt Quadrant is a better tool [0]. It's already linked to in the parent article. A lot of people are shoving all technical debt in the Deliberate / Prudent category. However I've seen a lot of code bases written by inexperienced developers, and they would have been done completely differently if the authors had any experience (both of the inadvertent categories). This means actually updating the code base can be a complete pain. Sometimes you know if you're editing the code you risk creating huge regression issues, and suddenly something that should have taken a short time can spiral out of control if you want to add more functionality. [0] [https://martinfowler.com/bliki/TechnicalDebtQuadrant.html](https://martinfowler.com/bliki/TechnicalDebtQuadrant.html) ------ donmatito My framework to think through technical debt: I don't think of it as debt (negatively connoted) but as a business investment. When you take on debt it's usually to finance something. Focusing the wording on debt highlights the negative side of the trade-off. It's like we've all suddenly turned into fiscal austerity hawks, where any bit of debt is evil. I feel like developers are each placed somewhere on a continuum from purists to entrepreneurs. Purists can spend weeks to tweak an SQL query, while entrepreneurs would be happy with a no-code Zapier MVP that actually lands customers. It's not to say that one is always better than the other. A balanced team probably needs both. In our company, I'm more on the entrepreneur side (which I guess is good for launching a startup) but the recent addition of a "purist" to the team has improved our codebase significantly. ~~~ wccrawford People who understand financial debt don't think of it as evil. It's a tool they use. Technical debt is the same way, and I often talk about it that way at work. But it's a phrase that means something that "business investment" doesn't. When I tell my boss that we're incurring technical debt (and what exactly that debt is), he is already in the "business investment" mindset on the matter. I'm the one trying to rein that in and focus on making the future easier by avoiding some of that debt now. But I fully understand both sides of the matter. ------ babesh 'technical debt' is a misnomer. The reality of most code, especially as it gets closer and closer to the end user, is that over time there will be requirements added and removed. It is really hard to perfectly design such a system from the start. Thus some level of technical debt is inevitable and it isn't really debt. For instance, an overriding goal of zero technical debt is not always the right decision.
You may be expecting additional requirements and have not come up with a better conceptual model. This focus on the technical makes you lose sight of underlying issues which are often organizational rather than technical. Is someone neglecting the code intentionally to get a promotion for a rewrite? Is there a ship now or feature mentality? Are your engineers empowered to change the code? Other industries and systems have the same issues. What do they call this? ~~~ kraftman You can't redefine technical debt to include future requirement changes, and then say that definition is wrong and it isn't really debt. ~~~ babesh It's not future requirement changes. Its that the conceptual model that you constructed for your original requirements doesn't accommodate a new requirement. Now if you construct a new conceptual model, you will have to modify the original code. It sometimes isn't feasible to do this to each new requirement especially if you suspect that the next new requirement will require yet another change to your conceptual model. This can happen because the product roadmap changes (new PM, business changes, etc...) Try writing UI for several different designers over the course of several years for several different iterations of features when there is no common design language. ~~~ kraftman Technical debt is about quality of code, not about changes in your conceptual model. Lets say I had a warehouse that stored food, all stored neatly in its own section, each with the least frequently moved at the top where I need a forklift to access, and the most frequently used at the bottom where they are easy to access. Someone orders too many boxes of rice, and there's no room in the rice section, but there's room on the lower shelves of the sauces so it gets stored there, and some are left on the floor, partially blocking forklift access. The immediate problem of rice storage is solved, but every time I need sauces I have to grab the forklift, and it may take longer to get there and back because the route is blocked. This will persist until I refactor the warehouse so that there is more space to store the rice appropriately. Now the manager decides we should be able to store refrigerated goods, and I have no refrigerators yet. The work to refactor the shelves to make room for the refrigerators is a conceptual change in how I use the warehouse, and not a direct consequence of how I've previously been storing goods. It's not tech debt. ~~~ babesh Quality of your code can be affected because your conceptual model is muddy. Makes it more complex, harder to understand, harder to change. Makes you implement a requirement in two different ways. ~~~ babesh It looks as if people have different definitions of technical debt. ------ Gpetrium I think it is worth noting that different organizations/teams may require differing amounts of work related to technical debt. This is often driven by business culture, priorities, team experience, long term business goals and many more. From experience, a lot of people/companies have difficulties marketing what/how/why/when technical debt should be tackled for both internal and external stakeholders. This leads to a less than optimal structure where "new" or "shiny" work is preferable and rewarded, as opposed to a more balanced approach. ------ austincheney I disagree that productivity cannot be measured. Measure productivity by the time it takes a developer to achieve a proof of accomplishing the business requirement through test automation. 
If it takes a developer 1 hour versus two business days of 8 hours there is a 16x factor of productivity. Tech debt are all the factors that erode productivity. This includes many technical factors from inefficient systems design to unnecessary testing. It also includes slowness from weak developers and developer training. I know it puts me in an extreme minority, but I am a big fan of extreme simplicity, which is not what it might suggest. Extreme simplicity removes everything in a system present for the easiness of the developer so that all that is left is only that which is necessary to accomplish a business requirement as directly as possible. If there is less to read (including dependencies) there is less to maintain and less to test. Simplicity is not easy. The confusion is that a system must achieve a certain level of easiness or it will be rejected by some developers outright or require training at great expense. My basic premise for making any challenging decision is to never compromise on integrity, which includes being honest about the challenge at hand. You can embrace that challenge directly and solve for it or you can hide from it behind layers of easiness. ~~~ m0zg This assumes goal complexity is measurable and you can run randomized experiments to measure what you call "productivity". Neither of these assumptions holds in the real world. E.g. changing some business logic can be 10 minutes of work to implement a hack (which will pass tests, mind you), or 2 weeks of strenuous yak shaving for a "principled" solution. Which would you rather pick? And more importantly, are you prepared to select for developers who favor 10 minute hacks to principled solutions? ~~~ austincheney You measure productivity by the time it takes to complete a task. That is both objective and measurable. How do you define a principled solution versus a hack? All that matters is whether the result achieves the desired business goals without regression, and that is what tests are for. That being said I would gladly pick the 10 minute solution that passes all the tests. So much of development is bullshit posturing for developers to justify their existence with unnecessary tasks to make things easier for themselves. ~~~ m0zg You measure productivity that way on the assembly line or other menial tasks where it's obvious how to do things and the work isn't very complicated. Trying to do this with creative professions is a disastrous mistake. >> So much of development is bullshit posturing for developers to justify their existence That applies to any profession. Developers aren't unique in this regard. ~~~ austincheney > You measure productivity that way on the assembly line or other menial tasks > where it's obvious how to do things Isn't it obvious how to do things in programming? Many developers, in my corporate experience, occupy themselves with how to do things in code. That is exceedingly unfortunate. How to write code should be an obvious disqualifier before obtaining employment. Nobody would hire a lawyer, for instance, who didn't know how to a legal argument. Writing code should be as obvious as writing an essay, but I suspect many developers that are hiring probably have trouble writing essays as well. Anyways, if you had the choice between a 10 minute pizza and a 3 day pizza with ingredients more evenly spaced apart which would you choose? ~~~ jepcommenter Writing code is not a problem, changing it without breaking is. Taking tech debt is not a problem, paying interest on it is. 
And you will hardly hire a lawyer to work in a legal domain unknown to him, in a foreign country, in a foreign language. ~~~ austincheney > changing it without breaking is That is why the software gods invented test automation. > paying interest on it is That is why simplicity is important, because less is more. All code is ultimately debt as it demands some amount of maintenance. When there is less to maintain there is ultimately less debt. > And you will hardly hire a lawyer to work in a legal domain unknown to him, > in a foreign country, in a foreign language. Has that ever happened to you as a software developer? The one time I was told to learn a new language I was allowed the time and space to learn it. ------ zmmmmm The problem with the concept of technical debt is the implicit suggestion that you know what is debt and what is not up front. An awful lot of debt is stuff that people thought was needed but turned out to be totally off base and now weighs everything down. And a large piece of that - not by any means all or even most, but a large piece - is people pre-emptively optimising the design to satisfy precepts put about by people like Fowler himself. ------ revskill To reduce/remove tech debt from my code, the only pattern I found helpful is separation of concerns: group the chunks of files/code that work together or get removed together. One typical example in the real world is React Hooks. By moving boring stuff into a folder called "hooks", we centralize each concern into its own file, so that we can manage them effectively, or not need to think about them anymore. ------ glangdale I find the phrase 'Technical Debt' to be unhelpful. If I am in financial debt I will definitely need to pay it off unless I go through bankruptcy. Technical 'Debt' seems to operate by bad analogy: I may opt to completely wipe out the subsystem that has accrued the debt and replace it with something else, or it could turn out that the subsystem solves some problem that wasn't business-critical anyhow. I've had to restrain developers from "turd polishing" systems that were poorly thought out to begin with. Code gardening in a subsystem that was simply the algorithmically wrong way of doing it (that we might have had to have shipped quickly in order to stay afloat) isn't a first class activity. The problem with the metaphor of technical 'debt' is that it implies a balance sheet, but many (most?) of these bills will never come due. ~~~ Forge36 That's the difficult part of being new on a project. If you aren't building the replacement, is progress being made to fix the issue? With regards to gardening: if you're spending all your time weeding but you haven't planted any seeds, what are you growing? ~~~ glangdale Hopefully the manager has enough sense to help a new person see the priorities. As a manager I've definitely 'called time' on weeding. Good analogy. ------ z3t4 Something takes 20 hours to implement, but you do a hack job in 2 hours. Next time someone touches that code, they probably need to spend the 20 hours to re-implement it. Technical debt can be made less costly by using the test-first approach: you start by writing a test that confirms what you are going to implement works. When doing test-first, it's often fine to just do the bare minimal work. If something needs to be added/fixed, write another test, then do the bare minimal change, repeat.
Eventually the code will be so complex a small change will take hours, and that's when you have to pay the technical debt, but if you have all the tests, you can make a clean re-write, and the tests confirms that the new clean code works just like the old code. ------ rsweeney21 Businesses should probably start tracking technical debt as a liability on their books. Maybe depreciate code as an asset like a business does with a truck or piece of equipment. Management would probably understand the concept better if it was represented on the financials. ------ azernik A related concept: defaulting on technical debt. If you know a piece of crufty code is going to be irrelevant in the near term after some major architectural change or new feature, it isn't really worth it to pay down the principal with minor improvements or shims. ------ throwaway_ndiuf I think it's marginally useful to think of low vs high interest debt in terms of technical debt. I can make assumptions that certain areas of software will not change in the future, but that in itself becomes technical debt. Now future features have to be designed around the idea that certain features are off limits. Sounds extremely risky to me to try to hedge your bets on this kind of foresight. On the other hand, I think it's extremely useful to examine different libraries / modules as having different levels of technical debt, but the metaphor stays intact. You just look at your software not as a single loan, but many micro? loans. ------ 40acres Technical Debt becomes really toxic when mission critical customer facing functionality is based on it. Just this morning we received a ticket where cleaning up technical debt caused a major issue for the customer, when we looked even further we realized that this customer relied heavily on the state brought on by technical debt and that resolving this issue will require an even more careful refactoring to ensure that the debt is cleared but the overall functionality is unchanged. As a workaround for the time sensitive issue we literally copied the old code and overloaded our class so that the customer runs the old code while we develop a robust fix. ------ tcgv > Given this, usually the best route is to do what we usually do with > financial debts, pay the principal off gradually. On the first feature I'll > spend an extra couple of days to remove some of the cruft. This should be managed carefully to prevent a lot of independent gradual improvements that don't follow a clear architectural pattern and end up conflicting with each other. Furthermore, all code changes to existing modules require validation of the affected functionalities, which are often hard to spot. Hence, I'd say that a wide scale refactoring effort needs a clear testing plan before it's put into practice. ------ danielecook I think technical debt extends beyond poor quality code that prevents additions or modifications. Having worked with a large amount of scientific software it extends into the efficiency and robustness of software. Frequently, efficient methods or parallelization are not used and the least creative method is implemented to solve a scientific question. Functions are not written in a modular way but merely for a “quick and dirty” analysis. The end result is that you spend more time getting things to run or waiting for them to run than if you had gone back and reworked the code to begin with. It’s a serious drag on productivity. ------ JoeAltmaier A startup without technical debt, is mismanaged. 
The financial runway is the first consideration for survival. Spending a second on anything that doesn't get the startup to the next phase is a second squandered. Any exception to that? ~~~ p0nce Any market-leading product which is known for having the worst codebase out of all the competitors? ------ ineedasername This article frames technical debt in terms of sub-par code, which is part of tech debt but only a part. Legacy systems that are nearing end of life, patches and updates that haven't been applied, all of that represents technical debt as well. And when it comes to some of it, it can be actual financial debt too. Plenty of legacy systems sit around a long time past their sell-by date because of the financial cost of replacement, which may be orders of magnitude more than the cost of maintenance. ------ ozim Funny thing about technical debt with programming is that sometimes, in my experience, I did not have to pay it back... Software went to trash just because the business changed direction. ------ flr03 A lot of people seem to complain about the wording. I understand the analogy doesn't cover 100% of the issue, but unless somebody finds a better one this is the best we have. Good enough so that stakeholders do sometimes allow us to take care of it. Yes, probably not all the time, and not enough; that's why we need to get better at communicating about it. In a corporate environment, communication skills are as important as technical skills, and we should seek to improve both. ------ Bombthecat Technical debt is like real-world debt. You can take on the debt and try to increase your business revenue by more than the interest on the debt. The same is true for technical debt, or the decision not to modernize / to write shitty solutions to fix something quickly. In the real world any businessman would ask before taking on the debt: is there a chance of paying it back? But no one asks about paying technical debt back, or if ignoring it will yield a high enough return... ------ clumsysmurf For anyone interested, a new book on the topic just came out: "Managing Technical Debt: Reducing Friction in Software Development (Sei Series in Software Engineering)" [https://www.amazon.com/Managing-Technical-Debt-Development-E...](https://www.amazon.com/Managing-Technical-Debt-Development-Engineering-ebook/dp/B07QRT48T6) ------ erikpukinskis As a side note, I noticed Fowler linked the CannotMeasureProductivity page using camel case, and that took me right back! I remembered using PhpWiki and being able to auto-link pages just by using camel case. I don't believe MediaWiki has that capability. And it's such a standard I had forgotten it even existed. Something very soothing about the idea that links could be handled so gracefully. ------ branko_d This is not just about the code, but also about the feature this code implements. A feature implemented through "indebted" code tends to be buggier and less performant. There are exceptions, of course, but by-and-large this has been my experience. So the "interest" is not repaid by just the programmers, but also by the users, over and over again as they use the feature. ------ afarviral How many companies are profiting from a poorly written codebase that just works well enough to be a marketable product, isn't being maintained, and there is no plan to maintain it? I can think of many utilities not being actively developed that are still part of a product, or a product in their own right. Surely that isn't debt in any sense?
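A back-of-the-envelope version of Bombthecat's return-versus-interest point, using figures in the spirit of z3t4's 2-hour hack versus 20-hour implementation upthread. The numbers are invented; the only claim is the shape of the arithmetic, not any real estimate.

    package main

    import "fmt"

    // breakEvenChanges returns how many future changes the affected code can
    // absorb before the "interest" on a shortcut exceeds the hours it saved.
    func breakEvenChanges(hoursSaved, extraHoursPerChange float64) float64 {
        return hoursSaved / extraHoursPerChange
    }

    func main() {
        // Hypothetical figures: the hack saves 18 hours up front (2h instead
        // of 20h) but adds 5 extra hours to every later change in that area.
        saved, extra := 18.0, 5.0
        fmt.Printf("The shortcut pays off only if the code is touched fewer than %.1f more times.\n",
            breakEvenChanges(saved, extra))
    }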
------ eddyg It's not "technical debt", it's "escalating risk". Technical debt gives the wrong impression to business people. [https://twitter.com/jessitron/status/1123310331957145601](https://twitter.com/jessitron/status/1123310331957145601) ------ largolagrande I really like the comparison of TD with the Tetris game : "You can’t win. You can only control how quickly you lose" Already posted on HN here : [https://news.ycombinator.com/item?id=19353352](https://news.ycombinator.com/item?id=19353352) ------ uberdru The concept of 'technical debt' seemed to emerge around the time that the concept of 'scoping' went out of fashion. The best engineers that I have ever worked with were truly masters of scoping projects. ------ kissgyorgy > Stated like that, this sounds like a simple matter of working the numbers I don't think it is that simple, because the attempt to remove cruft might introduce new cruft and can make it even harder to reason about the code. ~~~ bradenb He goes on to say it isn't that simple because we're bad at estimating and measuring productivity. ------ adamdonahue It's only debt if you have to pay it back. ------ crimsonalucard Technical debt is a bad name for this phenomenon. The word debt implies that we know it exists at the time of taking on the debt and that it can be paid back. In reality a lot of technical debt can never be paid back without restarting from scratch, and a lot of it is accumulated unknowingly only to be discovered way later in the project. ~~~ sixstringtheory I would say technical debt is only that which you already know about but have made an intentional decision to take out a “loan” to ship quickly. I’m having trouble thinking of an example of financial debt that can be taken on unknowingly. Maybe identity theft? But that's less like an engineering tradeoff... more like NSA backdoor type stuff. ~~~ crimsonalucard Debt heavy choices made knowingly are more rare, as programmers consciously avoid it or choose debt that won't accumulate beyond certain bounds. Whatever it is, I think most projects deal with "technical debt" accumulated unknowingly. There's no theory around how to create a "good design" and thus it's often shooting in the dark or using preexisting patterns. Most software projects have design flaws that only become evident later. In my career, I have dealt with this type of problem more so than debt accumulated knowingly. Additionally my current company is calling this type of debt, technical debt. I have rarely rarely seen technical debt taken on knowingly and documented. ~~~ helen___keller > Debt heavy choices made knowingly are more rare, as programmers consciously > avoid it or choose debt that won't accumulate beyond certain bounds I don't know what kind of software you work on, but I disagree. It's incredibly common, from what I've worked on, that we have an existing subsystem X, and project management wants it to interact with this new subsystem Y but it was never designed in a way to do so, so either we can hack it in real quick, build a moderately pleasing integration with only a little technical debt, or completely rebuild X from the ground up in a way that makes sense and leaves no technical debt. From my experience, bad managers will want the quick hack while engineers and good managers will take the middle option. Nobody opts for the complete rebuild. ~~~ crimsonalucard The design of subsystem Y is likely technical debt. 
The reason is, because of agile, software developers know that they must code defensively to account for feature changes or additions in the future. That means you take on technical debt every time you make a subsystem that isn't modular, meaning that if system Y is not infrastructure it should be able to be pulled out and used somewhere else. To not account for this feature is to not account for future change. A good way to test if your subsystem is modular is to ask yourself, can you pull your subsystem out of the current system and use it in a completely different app in a completely different context? If the answer is no, then likely the module is either IO or too tightly coupled with other systems. Too much mocking in your tests is another sign of a design flaw. It's a signal that your code is dependent on other systems rather than a module that can be pulled out and reinserted somewhere else. Another design flaw is, is system Y itself made up of modules or interconnected dependencies? If subsystem Y needs to be reconfigured to do something slightly different can I pull out 10% of the system and replace it with new modules? Or does pulling out 10% of the system involve pulling out another 80% of the system as a dependency? You wanted a banana but what you got was a gorilla holding the banana and the entire jungle. Most programmers choose dependencies because modules are harder. Also the dependency style of programming is heavily promoted. Additionally, most programmers are unaware of what it means to make a modular component, they organize code following nomenclature rather than a systematic theory, and they think they are doing modular design but they are not. Likely your subsystem Y was designed in such a way that it is tightly coupled to another subsystem and can't be used outside of a certain context. Possibly it is tightly coupled with IO or some specific database schema. Likely system Y itself is made up of modules that are interdependent. Also likely it was programmed by someone who did all of this unknowingly. So it could be a matter of perspective. What you see as debt taken on deliberately I see as debt from the past "discovered" and interest accrued because the debt can't be paid back without a rebuild. ------ bubblewrap I always thought "technical debt" was just a buzzword consultants use to make the client pay more money upfront. To me it translates to simply "work we haven't done yet".
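A minimal sketch of the "can you pull it out and reuse it" test crimsonalucard describes above. The report-total example and every name in it are invented for illustration, not taken from any particular codebase: the first function is welded to one app's database, while the second carries the same logic but can be lifted out, reused elsewhere, and unit-tested without mocking a database driver.

    # Tightly coupled: only works inside the one app that owns this database.
    import sqlite3

    def monthly_total_coupled(user_id):
        conn = sqlite3.connect("/var/lib/myapp/prod.db")   # hard-wired dependency
        rows = conn.execute(
            "SELECT amount FROM orders WHERE user_id = ?", (user_id,)
        ).fetchall()
        return sum(amount for (amount,) in rows)

    # Modular: the same logic with the data passed in. It can be dropped into a
    # batch job, another app, or a test that just hands it a plain list.
    def monthly_total(amounts):
        return sum(amounts)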
{ "pile_set_name": "HackerNews" }
A Dig Through Old Files Reminds Me Why I’m So Critical of Science - RougeFemme http://blogs.scientificamerican.com/cross-check/2013/11/02/a-dig-through-old-files-reminds-me-why-im-so-critical-of-science/ ====== ahelwer From personal experience (anecdote alert!), errors are also common in the ostensibly stone-cold-hard field of algorithms in computer science. A few years back I went on a string algorithm kick, and started dredging up old algorithm papers from the 80's on which to build Wikipedia articles. Often, the papers would get the _general idea_ right, but if implemented as described would not work at all or fail on edge cases. The best example I have is an algorithm to find the lexicographically-minimal string rotation[0]. The simplest and fastest algorithm to do this is based on the KMP string search algo, and is tribal knowledge among ACM ICPC competitors. I thought it was pretty neat and wanted to cement this algorithm in popular knowledge, so I set about researching and writing the Wikipedia article. I found the KMP-based algorithm in a 1980 paper[1] by Kellogg S. Booth. The paper has very detailed pseudocode which does not work. At all. The tribal knowledge version I inherited had similarities in the general idea of the algorithm (use of the KMP preprocessing step) but everything else was different. I scoured the internet for a retraction or correction, but all I found was a paper written in 1995[2] which mentioned in passing errors in the 1980 paper. I do wonder exactly how common this is. I emailed a professor who co-wrote one of the papers, and he replied that "it seems to me that all the algorithms (including our own) turned out to have errors in them!" Has anyone done studies into errors in computer science papers? [0] [https://en.wikipedia.org/wiki/Lexicographically_minimal_stri...](https://en.wikipedia.org/wiki/Lexicographically_minimal_string_rotation) [1] [http://www.sciencedirect.com/science/article/pii/00200190809...](http://www.sciencedirect.com/science/article/pii/0020019080901490) [2] [http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.55.9...](http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.55.9144) ~~~ rrrrtttt There is a point of view that says that computer science conferences exist for the purpose of gaming the tenure system. The name of the game is plausible deniability: you're not supposed to submit papers that contain known false claims, but everything else is fair game. And this has become such an integral part of the culture that technical correctness is no longer a necessary condition for accepting a paper [1]. I think in this light it's quite clear why many scientists are happy to leave their papers hidden behind the ACM paywall. [1] [http://agtb.wordpress.com/2013/04/14/should-technical- errors...](http://agtb.wordpress.com/2013/04/14/should-technical-errors- disqualify-conference-papers/) ~~~ ahelwer Thank you, that was a fascinating read. It is understandable that technical errors are given a pass, as they aren't the meat of the paper. In the case of the Booth paper, I really should state I do not mean to attack him. The idea of using the KMP preprocess to solve the problem is a wonderful approach and works very well despite the actual implementation being technically incorrect. If I recall, the bug had to do with the termination condition; the algorithm had to run twice as long to terminate correctly. I will say my understanding of the algorithm improved as a result of debugging it! 
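For anyone who wants to see the KMP-flavoured approach ahelwer describes, here is a short Python sketch of Booth's least-rotation algorithm in its corrected, commonly circulated form. This is not the pseudocode from the 1980 paper, and the variable names are mine:

    def least_rotation(s):
        """Return the index where the lexicographically minimal rotation of s starts."""
        ss = s + s                 # every rotation of s is a window of s+s
        f = [-1] * len(ss)         # KMP-style failure function
        k = 0                      # start of the best rotation found so far
        for j in range(1, len(ss)):
            c = ss[j]
            i = f[j - k - 1]
            while i != -1 and c != ss[k + i + 1]:
                if c < ss[k + i + 1]:
                    k = j - i - 1  # found a smaller rotation
                i = f[i]
            if c != ss[k + i + 1]:  # only reachable with i == -1
                if c < ss[k]:
                    k = j
                f[j - k] = -1
            else:
                f[j - k] = i + 1
        return k

    # e.g. least_rotation("bca") == 2, and "bca"[2:] + "bca"[:2] == "abc"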
------ mturmon The author says "Petrofsky was a lavishly honored star of the IEEE", but I was unable to find any honors he got besides an award for a paper in an IEEE journal ([http://ieee-aess.org/contacts/award-recipient/jerrold-s-petr...](http://ieee-aess.org/contacts/award-recipient/jerrold-s-petrofsky) \-- there could be other awards, but they don't show up in google). I thought "lavishly honored" would be shorthand for IEEE Fellow, but Petrofsky is not on the Fellows list. Saying his work was prematurely made into a biopic starring Judd Hirsch is not an indictment of science... ~~~ GalacticDomin8r Also he seems to lay at the feet of Petrofsky the failure of any further progress in the area and even seems to insinuate Petrofsky misled people intentionally. I don't follow him on point one, it seems a non-sequitur. As for the second, I'm going to quote a section from one of the articles he linked to: \-------------------------------------------------------------------------------------- “That was just for that event,” Davis, now Nan Huckeriede, said of her brief, but famous walk at graduation. “It was the computer-controlled electric stimulation, not me.” Davis had met Petrofsky while she was in college. “I was attending the WSU Lake Campus, and went to a spinal cord society conference there,” she says from her St. Marys home. “Jerry (Petrofsky) was a presenter, and afterwards, I introduced myself and told him I was interested in his research. “For about a month, I drove back and forth to Dayton to work with him, and then I transferred to the Dayton campus.” Following her graduation walk, Davis returned to her wheelchair, stayed in Dayton a few years, married, and then returned to St. Marys. “Jerry moved to California and stopped his research — I think he felt that he had gone as far as he could,” said Huckeriede. “But I still use the equipment he developed to get my exercise.” Last summer she traveled to Beijing for a procedure to strengthen her back and stomach muscles. “It didn’t work, but I knew it was experimental. It was worth a try.” \-------------------------------------------------------------------------------------- Given all this, my takeaway is that John Horgan has had an axe to grind for almost 20 years now and it's still not sharp. ------ mturmon Nice find. That kind of testimonial from the subject of the work is really important. Another relevant item. The magazine he wrote for, _The Institute_ , is a general-interest magazine of feature stories related to IEEE members. It's not a technical journal. It's more akin to the feature newsletters published by universities or engineering schools and sent to their alumni. The general-interest _technical_ IEEE journal is _Proc. IEEE_ , which is peer-reviewed and contains research articles and research summaries written by the experts themselves. ------ tokenadult I remember some of the same overhyped news stories the science journalist who wrote the article submitted here remembers. I especially remember the breathless (and false) reports about a "gene for" this or that human behavioral trait. The science news cycle[1] frustrates journalists, because every new study with an incremental finding (which may not even be replicable) has to be hyped up by research organization press offices, in the interest of obtaining more funding. The author's follow-up on a famous science story from early in his career is thought-provoking. 
Indeed, editors are more nervous about publishing stories, even very well reported stories, that question good news and expose hype or even fraud than editors are about publishing stories on the latest science hero. On the whole, it's good news that more and more scientists and journalists are alert to the possibility that a preliminary research finding may be false and overhyped besides. Here on Hacker News, we can keep one another alert by remembering the signs to look for whenever we read a new research finding news story.[2] Hacker News readers who want to learn more about how research articles become retracted may enjoy reading the group blog Retraction Watch[3] compiled by two veteran science journalists with lots of help from tipsters in the science community. I think I learned about Retraction Watch from someone else's comment here on HN. [1] [http://www.phdcomics.com/comics/archive.php?comicid=1174](http://www.phdcomics.com/comics/archive.php?comicid=1174) [2] [http://norvig.com/experiment-design.html](http://norvig.com/experiment-design.html) [3] [http://retractionwatch.wordpress.com/](http://retractionwatch.wordpress.com/) ------ powera There are always people who say "Science is still the best way of determining true statements" in response to these articles. But this isn't science. It's pure politics. And politics is probably the worst way of determining true statements. ~~~ crusso Exactly. The very nature of the tools used to advance and succeed in politics is anathema to doing good science. Treating science like you treat politics or marketing is akin to going to war in the name of Jesus or Gandhi. ~~~ dnautics Does it not seem silly, then, to use a political apparatus to fund science? ------ thrill "media hype can usually be traced back to the researchers themselves" Journalist investigates media hype and lays blame not on the media. Film at 11. ~~~ capnrefsmmat You might enjoy this paper: Gonon, F., Bezard, E., & Boraud, T. (2011). Misrepresentation of Neuroscience Data Might Give Rise to Misleading Conclusions in the Media: The Case of Attention Deficit Hyperactivity Disorder. PLoS ONE, 6(1), e14618. doi:10.1371/journal.pone.0014618.t003 It's thankfully open-access: [http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjourna...](http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0014618) They argue that a few prominent examples of misreporting in the media come from researchers misrepresenting their results in their abstracts, and journalists rarely read past abstracts. ~~~ musicaldope Interesting. Seems maybe journalists should rely on researchers other than a study's authors for interpretation? I don't mean that sarcastically (well, a little), but it seems that there's a fairly simple solution to this problem. ------ makmanalp > I wrote a puff piece about Petrofsky–based primarily on interviews with him > and materials supplied by him and Wright State–published in the November > 1983 issue of The Institute, the monthly newspaper of the IEEE My, could this be part of the problem? > It never occurred to me to question Petrosky’s claims. Or this? Maybe, just maybe, it's very hard to get good reporting on something by people unqualified in a subject? ~~~ anigbrowl He criticizes himself for this too. But he didn't make the guy famous; Petrofsky was already a star, the subject of a laudatory movie, and so forth. He became more skeptical and challenged Petrofsky's claims in print in 1985. I think that's a pretty good turnaround time. 
------ rtpg >“Academic scientists readily acknowledge that they often get things wrong,” The Economist states in its recent cover story “How Science Goes Wrong.” “But they also hold fast to the idea that these errors get corrected over time as other scientists try to take the work further. Evidence that many more dodgy results are published than are subsequently corrected or withdrawn calls that much-vaunted capacity for self-correction into question. There are errors in a lot more of the scientific papers being published, written about and acted on than anyone would normally suppose, or like to think.” I think the fact that people rarely retract can also be more of a practical issue than anything. Some guy wrote his master's thesis 3 years ago and it's in fact wrong? The author's too busy with his real life to have been keeping track of that. Or he's working on some other domain. Or the paper was written 30 years ago. I've also heard a lot of people say that a lot of research happens in the dark in many domains. People doing research in Haskell will gladly talk about their work on mailing lists it seems, but when it comes to chemistry, apparently it's a whole lotta silence. Pretty depressing. ------ pasbesoin One of my "pet peeves" (that has actually caused me a fair amount of aggravation and "expense", personally): Absence of evidence is taken (or, _insisted upon_ ) as evidence of absence. I've encountered this particularly in the medical community. As sort of a TL;DR: Many medical practitioners seem at best to be... "technicians" who are not much capable of more that following the "current script" that is handed down to them from blessed authority figures (including particularly if not only the pharmaceutical companies). P.S. I'll add insurance companies to the mix of authority figures, particularly in the U.S. They categorize and dictate what they will and won't pay for. Better doctors sometimes spend a lot of time finding ways around these restrictions in order to provide the treatment they think is actually appropriate and optimal. I would understand using statistical evaluation to help determine the best treatment approach. But when the profit motive enters, combined with a more or less fungible population of insured, the number crunching seems often to put cost ahead of outcome. ~~~ mikeash Absence of evidence _is_ evidence of absence, provided that you've actually looked. However, it's often not very _good_ evidence, and far from proof. But it's definitely evidence. ~~~ pasbesoin Yes, I think "proof" is a better word for what I was trying to describe. ------ kghose I did not take this as an anti-science article, but as an article critical of how academic science/engineering has become hype driven. Perhaps it was always hype driven, but I find it to be particularly bad now. Before you could do 'regular' work i.e. work which was scholarly and did not have terms like 'first to show' and which did not appear in the tabloids 'Science' and 'Nature' and still advance your career and get funding. Not so much any more. ------ d4vlx In other words, science is hard, predictions are unreliable, scientists are humans and humans make mistakes / are motivated by emotions. As usual with pieces critical of science they focus too much on a very small number of bad eggs and seem to implicitly assume that scientists should somehow be superhuman. ~~~ naterator > a very small number of bad eggs Even if there are a lot of bad eggs and hype and bullshit, you have to ask yourself what the alternative is. 
The non-science-based existence we suffered through for millennia? I think not. Excuse us for trying to cure cancer and failing less than 100% of the time. Part of the reason there is so much hype and bullshit is because, if we weren't cramming it down everyone's throats for the 30 seconds they'll pay attention, there would be no money funding science and we'd still be living in our own filth and praying to god that the plague stops. ~~~ crusso _Excuse us for trying to cure cancer_ That's a total misrepresentation of the complaint. The complaint is that claiming to cure cancer or being close to curing cancer to get some funding hurts the credibility of Science as an institution. _Part of the reason there is so much hype and bullshit is because_ How will training the public that Scientists are money-grubbing hucksters who are full of crap help the matter any? A lack of humility and self-criticism is a huge problem in any discipline, especially one that claims to be the best way to learn the "truth". ~~~ dnautics _How will training the public that Scientists are money-grubbing hucksters who are full of crap help the matter any?_ Also: It just might teach the public to actually take agency over who gets funded and encourage them to decide for themselves who is or isn't a huckster. Incidentally, I _am_ trying to cure cancer, and I've set up a nonprofit to do so... And am considering writing a piece explaining why you _shouldn't_ donate to me. (if you can't take the risk of failure, etc.) What do you think? Although I'm being genuine, is it too humblebraggey? ------ hacknat I think it's really important to talk about what Science is. It gets bandied about like it's this abstract idea when, in fact, it is a real process that is taking place. A good definition of Science is humanity's current working knowledge of reality based on the process of lots of people utilizing the Scientific method to test hypotheses. I think that's a fair definition. On examination of this definition you will notice it has a large human component. Science is like the stock market. It's lots of people spit-balling about what's happening in the market (in Science's case, the market is "ideas about reality"). In the short term Science can look really ugly, just like the stock market can; in the long term, however, we'd like to think of it as an accurate weighing machine. I generally think this is a fair assessment, but as I get older I start to see how few people there are who aren't willing to cut corners to get ahead. This worries me, because, like the stock market, Science affects real people's lives. It's all well and good that over the course of 100 years the Dow Jones will outperform cash or, really, any other investments, but that's of little use to the real people who get left behind in periods of great economic stagnation. Science can go through similar periods of stagnation, and currently, it seems like we might have hit upon one. It is possible to criticize the current way we have set up the Scientific endeavor without criticizing the abstract notion that human beings will generally discover new things over the long term. I think it is hardly controversial to say that our current way of doing things is not the best, but it may even be bad. Money and time are corrupting factors. Postings that used to require a PhD require a post-doc, positions that required a post-doc now require two. 
The immense pressure of publish-or-perish is becoming greater and greater and room for failure, which is an essential part of the Scientific method, is being squeezed out. This is not a good thing and, I think, is a larger reason, among others, why Science is becoming noisier and noisier. When the stock market becomes noisy it benefits insiders, but hardly anybody else. I think Science is, currently, in a similar place; its efficacy is being diminished by crap. ~~~ lutusp > I think it's really important to talk about what Science is. It gets bandied > about like it's this abstract idea when, in fact, it is a real process that > is taking place. But science isn't defined by its process, it's defined by its philosophical axioms, its foundational rules. The first and most important rule is that a scientific theory must be potentially falsifiable in practical tests -- if there's no empirical testability, there's no basis for falsification, therefore there is no science. The second rule is that scientific ideas cannot ever be proven true, only false. The third rule is that an idea without supporting evidence is _assumed to be false_ , not true (this is known as the "null hypothesis"). The remaining rules are comparatively unimportant -- these are the big three, without which any discussion of science is pointless. Science's process can change, and from field to field, it certainly does. But the rules stay the same. > Science is like the stock market. It's lots of people spit-balling about > what's happening in the market ... That is not science. To call that science is like confusing a spacecraft with a conversation in a bar about a spacecraft. > It is possible to criticize the current way we have set up the Scientific > endeavor without criticizing the abstract notion that human beings will > generally discover new things over the long term. Again, that is not science. Science's goal is not discoveries, its goal is to reliably refute ideas that are false, primarily by comparing them to reality. This is why science journalism articles that trumpet breakthroughs, with rare exception, do a disservice to both science and journalism. ~~~ calibraxis Science doesn't have such axiomatic rules; did Galileo lay out these axioms and proclaim the scientific revolution? As I understand it, science is a human enterprise with the goal of understanding principles... with limitations and strengths. Falsifiability in particular has serious criticisms, in terms of people taking it as a defining part of science. ([http://en.wikipedia.org/wiki/Falsifiability#Criticisms](http://en.wikipedia.org/wiki/Falsifiability#Criticisms)) I suspect (and it's just pure suspicion for now which I'm mentioning for no particular reason) it tends to be emphasized in cultures interested in debunking people's claims in a competitive debating way, rather than constructive conversation where both parties aim at coming to new understandings. I don't mean in science, but cultures influenced by science's success. ------ plg Nobody ever got a raise from their Dean or an endowed Chair because their work was celebrated for being careful, thoughtful, measured, balanced and realistic. Plenty of science that was uber-hyped at the time has turned out to be misguided and/or even outright wrong. Plenty of those scientists have led wealthy, rewarded lifestyles as a result of the hype. As an academic scientist one has to make a conscious decision to play the game or not play the game. There isn't a lot of room in the middle. 
You make your choices and you live with the consequences. You see your colleague making double your salary, you read the press office reports hyping their work, you understand that it's no more innovative, important, or TRUE than your work or anyone else's in your cohort ... but they are playing the game. Wouldn't you like to take your family to Hawaii for vacation? Wouldn't you like a bigger house? A nicer car? To send your kids to private school? Your University press office is practically going around begging for science stories to promote (i.e. hype). It's difficult to resist jumping in with both feet. It's a jungle out there people. ------ shadowOfShadow As with many things, we incentivize the wrong things. We pay for the headlines with clicks and eyeballs - we will get more headlines. ------ jamesash Reading through this article reminds me of why I'm so critical of science journalism: it's in the attention business, not the science business, and directing undue scepticism about so-called "breakthroughs" would kill a lot of great "stories". ------ enupten With the reward systems currently in place, what else would you expect ?
{ "pile_set_name": "HackerNews" }
(UK) CMA tackles undisclosed advertising online - DanBC https://www.gov.uk/government/news/cma-tackles-undisclosed-advertising-online ====== DanBC HN has talked about ad-blocking, and about how that might increase the use of paid-for-content. This article is a reminder that the paid for content needs to be clearly marked as an ad in the UK. This article is from the Competition and Markets Authority (a non-ministerial department of UK government). But there has been enforcement action from other regulators too. Here's what the Advertising Standards Authority (an industry group) say about vlogging and sponsored content: [https://www.asa.org.uk/News-resources/Media- Centre/2015/New-...](https://www.asa.org.uk/News-resources/Media- Centre/2015/New-vlogging-advertising-guidance.aspx#.VwJKipwrK01) [https://www.cap.org.uk/Advice-Training-on-the- rules/Advice-O...](https://www.cap.org.uk/Advice-Training-on-the-rules/Advice- Online-Database/Video-blogs-Scenarios.aspx#.VwJKjJwrK01) Notice that the vlogging guidance says the marking-as-an-ad needs to happen _before_ the user clicks anything - a message at the beginning of the video is not sufficient.
{ "pile_set_name": "HackerNews" }
"X-" deprecated for HTTP headers - michaelfairley http://tools.ietf.org/html/rfc6648 ====== Jach I was going to make a joke that "X-Subliminal" from <http://tuxgames.com> should become "com.tuxgames.subliminal", but I see they already made it for me... > _In some situations, segregating the parameter name space used in a given > application protocol can be justified:_ > _1\. When it is extremely unlikely that some parameters will ever be > standardized. In this case, implementation-specific and private- use > parameters could at least incorporate the organization's name (e.g., > "ExampleInc-foo" or, consistent with [RFC4288], "VND.ExampleInc.foo") or > primary domain name (e.g., "com.example.foo" or a Uniform Resource > Identifier [RFC3986] such as "<http://example.com/foo>). In rare cases, > truly experimental parameters could be given meaningless names such as > nonsense words, the output of a hash function, or Universally Unique > Identifiers (UUIDs) [RFC4122]._ ~~~ raldi Could you explain the reference in your first paragraph? ~~~ krakensden Open tuxgames.com with the Chrome Inspector or Firebug on, and look at the reply headers. They have an unusual one: X-Subliminal:You want to buy as many games as you can afford ------ drsim Hmm... I've worked on software at three different orgs that rely on the X-Forwarded-For header to identify a client's IP address from in front of a firewall (I'm not a network guy, but I think it was Cisco equipment). I agree the X- prefix ain't great. Is there an alternative for the originating IP address? If not, we need one. ~~~ apgwoz Well, I guess their hope is that people start using Forwarded-For, and then it becomes standardized in the same way that X-Forwarded-For has sort of become standard. ~~~ drsim Thanks. Draft here: [http://tools.ietf.org/html/draft-ietf-appsawg-http- forwarded...](http://tools.ietf.org/html/draft-ietf-appsawg-http-forwarded-04) ------ yaix Good thing. Next, please convince browser vendors to support HTTP methods (PUT, DELETE, etc.) in forms. ~~~ MatthewPhillips You're probably aware of this, but for those that are not, you can easily get around this problem by adding a hidden form field with X-HTTP-Method-Override as the name and the method as the value. Assuming your web server supports the field. ~~~ MarceloRamos Awesome, I don't know if I'll ever use it, but it's good to know. ~~~ MatthewPhillips If you want to build a RESTful website, it's really hard to get around it. Consider logging out. You want to send a DELETE to /Session. So you can just do this: <form action="/Session" method="post"> <input name="X-HTTP-Method-Override" type="hidden" value="DELETE" /> <input type="submit" value="Log off" /> </form> ------ natrius Next stop: CSS vendor prefixes. ~~~ lloeki Not. How would you handle the case of pre-standard implementations? Like: -webkit-gradient(<type>, <point> [, <radius>]?, <point> [, <radius>]? [, <stop>]*) vs: -moz-linear-gradient([ [ [top | bottom] || [left | right] ],]? <color-stop>[, <color-stop>]+); which eventually became: linear-gradient( [ [ <angle> | to <side-or-corner> ,]? <color-stop> [, <color-stop>]+ ) In CSS, the vendor prefixes are _exactly_ what we need. They help bootstrapping, discussing, and proofing standards by having pre-standard implementations in the wild. CSS is vastly more complex than HTTP headers which are most of the time, key- value(s). ~~~ regularfry You can have pre-standard implementations without vendor prefixes. 
Here's an argument against them: http://www.quirksmode.org/blog/archives/2010/03/css_vendor_pref.html And here's some further discussion: http://www.quirksmode.org/blog/archives/2010/03/css_vendor_pref_1.html Personally I think vendor prefixes are an awful hack. We've already got browser-conditional comments in HTML, why not use them? Hell, browser-specific stylesheets would be a better option from where I'm sitting. ~~~ tszming >> We've already got browser-conditional comments in HTML You mean the `conditional statements` introduced and only used by IE? <http://en.wikipedia.org/wiki/Conditional_comment> ~~~ regularfry Yep. It's a better plan than vendor prefixes. Edited to clarify: I mean that browser-conditional comments should be implemented across browsers, not that we should rely on features already implemented. ------ Sniffnoy More than just HTTP headers! ------ firlefans Firstly, I'm only discussing its application to HTTP and I realise this is a more general rule. Custom X- headers are a hack, but they're also incredibly useful for simple debugging ala FirePHP, for transmitting application specific metadata or caching metadata in dev environments. The RFC even mentions a similar objection under three 'primary objections to deprecating the "X-" convention' - BCP 82, <http://tools.ietf.org/html/bcp82>. Also, this statement, "the name space is not limited or constrained in any way, so there is no need to assign a block of names for private use or experimental purposes", this is true only for application/browser vendors (native devs), web devs are stuck with X- headers for sending custom data to the browser without interrupting page output. ~~~ andrewaylett I think the point is more that, instead of using X- headers you should just declare your header de-facto standard (because you're using it!) and name it without the "X-". The issue with prefixing headers with "X-" is that, in theory, when people start using it you've got to remove the "X-" at some point -- but that would cause all sorts of compatibility headaches. ------ phibit Does anybody know what the recommended practice is? ~~~ momokatte It's not just about the prefix -- the goal is to get rid of the unstandardized parameter namespace entirely. New parameters should be named for permanence, so if they do become standard there's no need for migration. <http://tools.ietf.org/id/draft-ietf-appsawg-xdash-05.html> ------ sleepyhead What is wrong with custom http headers? Like these from Cloudfront: X-Ua-Compatible: IE=Edge,chrome=1 X-Request-Id: 19c0a05fd28e371127a76f658203eace X-Runtime: 0.002039 X-Content-Digest: 2c4b5e9fc815a69ecb514e41e27e6ac7f2716801 X-Rack-Cache: miss, store X-Varnish: 1299462197 X-Cache: Hit from cloudfront X-Amz-Cf-Id: kDpGVcaIcDVQm- PkMd_FPnR_IcPZqPgR0NxKEvmOBWKaURpeus5vEw==,_ycCBrLJT91EOYt14GTciFKKLlpzLt1cogvPpK2PkDP7QzYVGbcsGQ== ~~~ jjguy Appendix B: The primary problem with the "X-" convention is that unstandardized parameters have a tendency to leak into the protected space of standardized parameters, thus introducing the need for migration from the "X-" name to a standardized name. Migration, in turn, introduces interoperability issues (and sometimes security issues) because older implementations will support only the "X-" name and newer implementations might support only the standardized name. 
To preserve interoperability, newer implementations simply support the "X-" name forever, which means that the unstandardized name has become a de facto standard (thus obviating the need for segregation of the name space into standardized and unstandardized areas in the first place). Most of your examples are covered under the "exception 1" clause, also in Appendix B: In some situations, segregating the parameter name space used in a given application protocol can be justified: 1. When it is extremely unlikely that some parameters will ever be standardized... 2. When parameter names might have significant meaning... 3. When parameter names need to be very short (e.g., as in [RFC5646] for language tags)... ------ mildweed So, now we in the community need a clearinghouse to declare new HTTP headers. Let me be the first: Comment. <?php header('Comment: some context on why this reply happened'); ?> ------ saurik This was also discussed when it was a draft a few months ago. <https://news.ycombinator.com/item?id=3539663> ------ pacoverdi I guess django will have to find a replacement for X-CSRFToken ~~~ apgwoz Yes, CSRFToken. ------ SimHacker Does this mean X-Windows is finally obsolete? ~~~ astrodust Technically there's no plural in the "X Window System", X or X11 for short. (<http://en.wikipedia.org/wiki/X_Window_System>) Windows is a Microsoft product. ------ jpswade ...and while we're on the subject HTTPDs should return your IP address in their headers. This would make IP discovery easy peasy! ------ shasty I don't know if this will get approved but forcing us all to integrate headers into HTTP standards is not going to work. That's why X headers exist. We need ways to move forward or even sideways if that's what we choose, using an extension mechanism that is simple.
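Tying this back to drsim's X-Forwarded-For question above: the draft drsim links eventually standardized a plain Forwarded header. Here is a rough Python sketch of how an app might prefer the standardized name and fall back to the old X- form. The parsing is deliberately naive (no trust or anti-spoofing checks), and the header values are just documentation-range examples:

    def client_ip(headers):
        """Best-effort originating IP from proxy headers (naive sketch)."""
        fwd = headers.get("Forwarded")        # e.g. 'for=192.0.2.60;proto=http;by=203.0.113.43'
        if fwd:
            first_hop = fwd.split(",")[0]     # first proxy element only
            for part in first_hop.split(";"):
                name, _, value = part.strip().partition("=")
                if name.lower() == "for":
                    return value.strip('"')
        xff = headers.get("X-Forwarded-For")  # legacy form, e.g. '203.0.113.7, 10.0.0.2'
        if xff:
            return xff.split(",")[0].strip()
        return None

    # client_ip({"Forwarded": "for=192.0.2.60;proto=http"}) -> "192.0.2.60"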
{ "pile_set_name": "HackerNews" }
Cabyn: A Social Network to Meet New Friends - alteredorange https://cabyn.co/home ====== marc0 It seems that the site requires location information and cannot cope with location blocking. Also, I get an error "Location information is unavailable." ~~~ alteredorange Yes, it is location based, so it needs your location, or it can't do anything. Depending on your browser you should be able to click something like "allow location". ~~~ marc0 yes but what if my location is not determined correctly by the browser (eg when connected via a VPN with entry point in another city/country)? I must be able to specify my location manually. ------ wheresvic1 The ui is broken - if you click getting started, you receive a blank page :( ~~~ alteredorange It should take you to the login page, you didn't see any login buttons?
{ "pile_set_name": "HackerNews" }
Ask HN: What is the best blog/article you read on internet? - blohs ====== davidmurdoch I love Wait But Why's post on AI: [https://waitbutwhy.com/2015/01/artificial-intelligence- revol...](https://waitbutwhy.com/2015/01/artificial-intelligence- revolution-1.html) ~~~ TheAlchemist This one is great ! But you could as well post half of Wait But Why posts - it's really one of the most interesting blogs on the web. I love this one more though (much shorter, but makes you think, and once you start thinking about it seriously...): [https://waitbutwhy.com/2016/10/100-blocks- day.html](https://waitbutwhy.com/2016/10/100-blocks-day.html) ------ sidcool There is not speed limit: [https://sivers.org/kimo](https://sivers.org/kimo) ~~~ shubhamjain This is an amazing article that I had read a few years back. Incidentally, I had tried many combinations of keywords to find it again but no avail. Thanks to you, my search is finally over. ------ ShannonAlther [https://www.gwern.net](https://www.gwern.net) Gwern is an independent researcher who studies... a lot of things. He's documented some of the history of the dark web, blogs about his nootropics experience and, perhaps most notably, predicted that bitcoin might reach $10,000... in 2011. [https://thelastpsychiatrist.com](https://thelastpsychiatrist.com) The blogs that aspire to follow in TLP's footsteps are legion. ------ ianmcgowan [https://www.ribbonfarm.com/2009/10/07/the-gervais- principle-...](https://www.ribbonfarm.com/2009/10/07/the-gervais-principle-or- the-office-according-to-the-office/) Reading this in middle management at a bank was illuminating. ~~~ matt_s I read through those while working at a Fortune 500 company. It sounded so close to truth, like the author was observing the same interactions I saw. Turns out he was at the same company at one point. The fact that those stand the test of time and resonate with so many people working in large organizations really helps with perspective. ------ exolymph Slate Star Codex is a foundational blog for me. "I Can Tolerate Anything Except the Outgroup" is a good place to start: [https://slatestarcodex.com/2014/09/30/i-can-tolerate- anythin...](https://slatestarcodex.com/2014/09/30/i-can-tolerate-anything- except-the-outgroup/) Then the rest of the best-of: [https://slatestarcodex.com/about/](https://slatestarcodex.com/about/) Lately I've been obsessed with Samzdat on Seeing Like a State: [https://samzdat.com/2017/05/22/man-as-a-rationalist- animal/](https://samzdat.com/2017/05/22/man-as-a-rationalist-animal/) ------ 1bm Interfluidity, for instance this [https://www.interfluidity.com/v2/3487.html](https://www.interfluidity.com/v2/3487.html) ------ gt2 [https://dailystoic.com/seneca/](https://dailystoic.com/seneca/) ------ JBReefer The Truth About Cars. Product Management, strategy, and product-market-fit are not isolated to software, and we're frankly not very good at them compared to 100 year old companies. Watching Tesla show the faults in the SV way of doing things has been really fascinating, for example. ------ praeconium maybe subjective, but its pure gold.. [https://scottlocklin.wordpress.com/](https://scottlocklin.wordpress.com/) ------ soared Simo Ahava is an absolute legend for anything google tag manager / google analytics related. No other site is even remotely close. [https://www.simoahava.com/](https://www.simoahava.com/)
{ "pile_set_name": "HackerNews" }
Advanced Python Features - federicoponzi https://tech.io/playgrounds/500/advanced-python-features ====== eindiran Home-brewed context managers are new to me: I had only encountered using them in the standard "with open(...) as f:" case. Can someone who has used them before point out a use case for writing your own? ------ reacharavindh I get a blank text page when trying to read on the phone.
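One common use case for a home-brewed context manager, in answer to eindiran's question above — a small sketch (the directory-switching helper is my own illustration, not taken from the linked article):

    import os
    from contextlib import contextmanager

    @contextmanager
    def working_directory(path):
        """Run a block inside `path`, restoring the old cwd even if the block raises."""
        previous = os.getcwd()
        os.chdir(path)
        try:
            yield path
        finally:
            os.chdir(previous)

    # with working_directory("/tmp"):
    #     ...  # everything here runs with /tmp as the cwd; it is restored afterwards

The same setup/teardown pattern covers temporary config overrides, lock acquisition, timers, and test fixtures — anywhere cleanup must happen no matter how the block exits.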
{ "pile_set_name": "HackerNews" }
PSn00bSDK – An open source PS1 SDK that doesn't suck - Mizza https://www.youtube.com/watch?v=E5A96-pRF2w ====== Mizza Source here: [https://github.com/lameguy64/psn00bsdk](https://github.com/lameguy64/psn00bsdk) Since the N64 programming guide showed up on the front page, here's a good PSX SDK that I stumbled upon the other day.
{ "pile_set_name": "HackerNews" }
Update to R requires re-installation of packages - gk1 https://www.dominodatalab.com/blog/significant-update-to-r-requires-re-installation-of-packages/ ====== PaulHoule Repeatable work means understanding your inputs and outputs. If you just slap a brand name on it without that understanding you've just added another black box. Maybe I'm hard to convince because I've worked in this space, but the first thing I think is that this is "Anaconda 2.0"
{ "pile_set_name": "HackerNews" }
Ask HN: Startup, Project, or something else ... - RiderOfGiraffes Here's a thing ...<p>When I've asked people what the difference is between a restaurant and a cafeteria, the main points were that in a cafeteria you get your food from a central location, pay for it in advance, and take it to your table, whereas in a restaurant you sit at your table, order your food, it's brought to you, and you pay afterwards.<p>Yes?<p>So why do MacDonalds call themselves a restaurant?<p>In a similar vein, I'm working on a web site. What criteria should I use to decide whether to call it a start-up, a project, or something else. ====== yan In my experience, it mostly depends on the amount of the start-up kool-aid the author consumed (or 'founder') and rarely so the nature of the actual project. Thus, you get some people hosting 'sites' that are very profitable, self-sustaining and add value to society, and others creating tiny twitter apps (twart ups!) and calling themselves start-ups. ~~~ poppysan mmmm start-up kool aid.... Im drowning in it at my start-up. ------ noodle you serve yourself in a cafeteria. you have a server in a restaurant. you have a project until you form a company. then, you have a start up. ~~~ RiderOfGiraffes What if I form a company, but it has no employees, it doesn't turn a profit, I'm running it in my spare time, and I don't really see how I can turn it into a full-time profitable venture. Is that a start-up, a side-line, or a project? ~~~ noodle yes. you can still theoretically call it a startup. i just wouldn't waggle that tag around too much until you have something to back it up.
{ "pile_set_name": "HackerNews" }
Ask HN: What successful "tech" companies ($50mm+) have non-technical founders? - miller4191 It's hard to define "tech company," but perhaps the majority of the company's business coming through online / software channels. Examples include: BirchBox, Warby Parker, etc. ====== kurtvarner Etsy - Robert Kalin Groupon - Andrew Mason LivingSocial - Tim O'Shaughnessy Fab - Jason Goldberg Mahalo - Jason Calacanis Gilt Group - Kevin Ryan ShoeDazzle - Brian Lee LegalZoom - Brian Lee ...But yes, you should still learn to code. ~~~ ifearthenight And sadly the second one on your list is not only non-technical but also non- business mathematics savvy :) [http://money.cnn.com/2011/09/23/technology/groupon_revenue/i...](http://money.cnn.com/2011/09/23/technology/groupon_revenue/index.htm) ------ philipdlang Would Zappos count?
{ "pile_set_name": "HackerNews" }
How do you archive, deduplicate and search e-mails offline? - youseecomrade I have thousands of emails from multiple accounts. Some of them are .eml and others are entire .mbox mailboxes.<p>I would like to be able to search them and also import new emails (manually, I&#x27;m not going to connect to IMAP or anything else). It would need to ignore duplicated e-mails e.g. re-import a Google Takeout .mbox without creating complete chaos.<p>- [email protected] -- inbox -- sent -- folder1 -- folder2<p>- [email protected] -- inbox -- sent -- thingsishouldnthavesent ====== Alex3917 You can search them using Stanford's ePadd software. edit: For deduplication you can use Aid4Mail, MailStore, or BitCurator. ------ dredmorbius Mutt. Notmuch. Offlineimap.
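A hedged sketch of the manual-import-with-dedup part of the question, using only the Python standard library (mailbox, hashlib). Keying on Message-ID with a content-hash fallback is my own assumption about what counts as a duplicate, not necessarily how the tools mentioned above do it:

    import hashlib
    import mailbox

    def import_mbox(path, archive, seen):
        """Copy messages from an .mbox file into `archive`, skipping duplicates.

        `seen` is a set of keys the caller persists between runs; a message's
        Message-ID is used when present, otherwise a hash of its raw bytes.
        """
        added = 0
        for msg in mailbox.mbox(path):
            key = msg.get("Message-ID") or hashlib.sha256(msg.as_bytes()).hexdigest()
            if key in seen:
                continue          # e.g. re-importing the same Google Takeout dump
            seen.add(key)
            archive.add(msg)
            added += 1
        return added

    # archive = mailbox.Maildir("archive", create=True)
    # import_mbox("takeout.mbox", archive, seen=set())

With everything funneled into one Maildir per account/folder, an indexer like notmuch (per dredmorbius) can handle the offline search side.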
{ "pile_set_name": "HackerNews" }
Could Bitcoin Be More Disruptive than the Internet? - freework http://www.foxycart.com/blog/could-bitcoin-be-more-disruptive-than-the-internet#.UoGhoZR4aMQ ====== lukestokes Hey all. This is my first Hacker News comment and the first time one of my posts got shared. Thanks for that. Sorry the title is lame, but I went with something which would get attention at BarCamp Nashville. Not all of them are tech savvy, so the title served its purpose and got the talk selected. Moving on from the title, what do you think of the content? There are a lot of Bitcoin haters out there, but also a growing group of supporters. From what I've seen, few informed people argue against the actual technology while some argue about the politics or economics. To my knowledge, we've never seen a deflationary currency (outside of some babysitter experiment). We've never seen a store of value with no third party risk which can be transferred anywhere in the world immediately. As for wasted energy concerns, my understanding is the system is more efficient than the overhead associated with ACH, Credit Card, PayPal or other networks for securing transactions. Either way, curious what you all think about the content. As this is my first HN comment ever, please, be nice. :) ------ jacques_chester This is one of the more egregious Betteridge's Law violations I've seen today. ~~~ skrebbel For the slightly less well-informed: [http://en.wikipedia.org/wiki/Betteridge's_law_of_headlines](http://en.wikipedia.org/wiki/Betteridge's_law_of_headlines). Notably, 'Betteridge's law of headlines is an adage that states, "Any headline which ends in a question mark can be answered by the word no."' Also, for the non-native speakers: [http://en.wiktionary.org/wiki/egregious](http://en.wiktionary.org/wiki/egregious), 'reflecting the positive connotations of "standing out from the flock"'. (edit: removed bitterness) ~~~ jacques_chester "As smart" isn't fair. "As smug" or "as familiar with a nominal rule-of-thumb" would work better. Edit: It always gets nominated for hyperbolic headlines. I just happened to be the first on the scene. ~~~ skrebbel Agree. Took all the bitterness out. ------ wonkus No. Normals are easily pwned. You remember all those botnets that steal normals' internets? They will now stead their bitcoin wallet instead. Normals will lose their bitcoin to theft and never trust it again. End of story. The only thing you accomplish with bitcoin is wasting electricity and making global warming worse. Please stop destroying the planet. ~~~ eof "end of story" so so silly. this thing is 5 years old and tens of millions of dollars of wallets have been stolen. the story is currently peaking, not ending. the real way the story will play out is that solutions will come into existence so the 'normals' don't get pwned as much. ~~~ glesica You mean like currency that is centrally controlled and physical and can be kept nice and safe in a bank? Oh wait, already have that, people call it "money". Most people don't want to live in the wild west, that is a fact, and that's why the wild west went out of existence. There will always be some people who want to live dangerously / adventurously, but their experiences and preferences do not generalize to the entire population. ~~~ eof the problem with your counter argument is that it ignores the basic, undeniable fact: what you are referring to as "money", presumably government fiat: euro, dollar, yen, etc.. 
has certain flaws that bitcoin does not (not to say bitcoin doesn't have it own, different flaws), namely: government fiat, historically, has always been deflated; with the vast majority of fiat currencies eventually becoming worthless due to the government abusing the currency or losing their power; and, contemporary "money" is subject to restrictions, capital control laws, etc.. which _some_ people find to be too cumbersome. bitcoin has an answer for these undeniable flaws; and besides.. no one is saying that "the entire population" needs to do anything. bitcoin can (and at this point, almost certainly will) exist right along side all these "real" monies you refer to; and people can choose, or not, to use bitcoin (and part of what is appealing to me, and likely others.. is that regardless of what any laws say, I can still "choose" to use bitcoin even if some jurisdiction declares it illegal--unlike fiat.. where if the government decides i shouldn't have access to my money they can simply shut me down with zero recourse if the courts (or despots, depending on jurisdiction) don't see my side. ~~~ MaysonL Whereas bitcoin is computational fiat money. ~~~ eof i'm not sure fiat means what you think it means ------ csense Claim: The answer is no. Proof: Bitcoin requires the Internet to function. Therefore, the things disrupted by Bitcoin are a subset of the things disrupted by the Internet. The Internet can be used for disruptive things other than Bitcoin, so the things disrupted by Bitcoin are a _proper_ subset of the things disrupted by the Internet. Since the total number of things in both categories is finite, this means fewer things are disrupted by Bitcoin than are disrupted by the Internet. QED. ~~~ eof I originally thought this; but ultimately have to disagree. Several problems: 1\. "the internet" is a pretty nebulous term. bitcoin certainly requires computer networks, which generally require tcp/ip; but bitcoin could technically function on a darknet, which could arguably be "not the internet". 2\. the internet relies on the power grid, which itself relies on copper wire; which essentially, and eventually, says, by your logic, that the internet is a subset of the things disrupted by copper mining. now i actually agree that in the end the internet is a bigger disruptor; but i don't think your proof holds. ~~~ csense The Internet could have happened on a planet without copper. Could Bitcoin have happened on a planet without the internet? 
------ l0gicpath It's certainly disrupting a lot of people's peaceful nights with all the stories that are showing up of exchanges disappearing or supposedly being hacked. But then again, I'd presume given the nature of its economy, you'd have to be a fool if you didn't consider these outcomes possible. ------ adrianwaj The correct heading would be: "Could Bitcoin be the most disruptive thing to arrive on the internet since the web?" Rather than look at bitcoin, best to look at the websites that it brings about, and the ones it minimizes -- especially the ones that don't embrace bitcoins, such as banks. You could also measure it as a proportion of internet traffic vs web traffic. The key is that websites use bitcoins, and to take that into account. ------ jedunnigan I saw this earlier, I still don't get the logic of the conclusion. Can you really qualify technology B (Bitcoin) as being more disruptive then technology A (internet) when B requires A to operate? Maybe if you look at the technologies in the vacuum in which they were created, but in reality you'd want to contextualize them as extensions of one another, otherwise you get a biased picture of their disruptivity. edit:sp ~~~ mtgx You make a great point, which is probably true. By that logic the Internet is also less a revolution than transistors, which I guess it is. Unless we talk about "hidden" vs "obvious" revolutions. Because people don't really know about "transistors", but they do about the Internet, and "use it" every day knowing they do so. So in a way, you could use the argument that the Internet is a more obvious and practical revolution (closer to the end user). Bitcoin could be the same vs the Internet, although I don't think we really need to qualify them like that. They could both be very important. ~~~ jacques_chester I suppose this means that the question is whether you pay kudos to the immediate thing or to its necessary causes. We can follow this line back to arguing that nothing is more disruptive than the Big Bang. ~~~ jedunnigan >We can follow this line back to arguing that nothing is more disruptive than the Big Bang. Well yes, but the Big Bang was not a discovery/invention made by man. Discovery of how to start a fire would probably be a good candidate for the most disruptive tech. ~~~ jacques_chester Part of the difficulty is that many discoveries and inventions reach the threshold of being necessary causes of the modern world. Once you pass that threshold, how do you rank them? The removal of _any_ of them (eg fire, the wheel, mathematics, steel, steam ...) renders the current world impossible. Are there degrees of impossibility? I'm not sure. We may be looking at a partially-ordered set here. ------ Zarathust Internet allowed the creation of things such as Bitcoin. Btc is only making the Internet impact larger, if such a thing is ever measured and matters at all. ------ Fomite No. ------ a3voices More disruptive than the Internet? No. More disruptive than Facebook? Yes.
{ "pile_set_name": "HackerNews" }
Brazil loses second health minister in less than a month as Covid-19 deaths rise - spacial https://www.theguardian.com/world/2020/may/15/brazil-health-minister-nelson-teich-resigns ====== spacial Brazil is fighting two viruses simultaneously :~(
{ "pile_set_name": "HackerNews" }
Blind Signature - rfreytag https://en.wikipedia.org/wiki/Blind_signature ====== nullc The "blind ECDSA" in that article is not an ECDSA signature at all. It's a blind Schnorr signature. The article does not mention that the construction there (and ones like it) is vulnerable to a one-more-signature attack if parallel queries are possible-- which is easily a fatal vulnerability in the headline digital cash application for blind signatures. ~~~ corndoge Please update the Wikipedia article! ------ rusbus It occurred to me that with 2 factor authentication, the server needs to hold on to the secret key. If that is leaked, the 2 factor is useless.[1] Contrast this with salted hashing where leaking the hash doesn't directly lead to compromise. Could you use a blind signature to construct a rotating code 2 factor scheme where leaking the server credential doesn't lead to complete compromise? [https://rcoh.me/posts/two-factor-auth](https://rcoh.me/posts/two-factor-auth) ~~~ tptacek The reason a leaked password is devastating is that passwords are reused, both on the site itself (people will select variants of their old passwords when changes are demanded, and eventually rotate back to the original), and, more importantly, on other sites. This means that a compromise of a database of passwords endangers not just that site but many other unrelated sites. TOTP and U2F keys aren't cross-site, and they don't occupy any space in the user's brain. They can be treated as ephemeral and effaced and regenerated. So we're not generally that alarmed at the prospect of them appearing on Pastebin, modulo of course the implications that has for the rest of our security. Almost every realistic scenario in which TOTP/U2F keys are leaked implies that all serverside security has been lost anyways, so protecting the keys themselves is a secondary concern. Any reasonable site operator would invalidate all stored keys after such a compromise. By the way: salting hashes doesn't really protect against password recovery. In modern systems, it's "stretching" that does the real work. You should be using a modern password hash like bcrypt or scrypt, not simply randomizing a standard cryptographic hash. ~~~ tialaramex I don't agree with the de-emphasis of salting. Salt eradicates time-space tradeoffs. Any successful scheme is going to get used by lots of people which means a time-space tradeoff would be hugely valuable even if you have GPUs or even ASICs on your side. Even an eyeball test shows the value of salt. Here are 4 million usernames and hashed passwords. Oh, 86000 of them have the same hash. Ok, those 86000 users have some easy to guess password, we'll break that first. ~~~ Tomte It doesn't really eradicate time-space tradeoffs. It only mildly alleviates them. Please look up bcrypt and why it's better than a simple salt. No one proposed getting rid of salting without another measure. And you don't "break a password first". You brute-force and compare against all of the hashes.
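For readers who only want the core idea behind the headline topic, Chaum's original RSA-based blind signature is easy to sketch. The numbers below are textbook toy values and the "message" would really be a padded hash, so treat this as an illustration of the blinding algebra only, not a usable scheme; it is also not the blind Schnorr construction nullc describes above.

    # Chaum-style RSA blind signature, toy parameters only; never use as-is.
    p, q = 61, 53
    n = p * q                # public modulus (3233)
    e, d = 17, 2753          # textbook public/private exponents, e*d = 1 mod phi(n)

    m = 65                   # message representative; in practice a padded hash < n
    r = 7                    # requester's secret blinding factor, coprime to n

    blinded = (m * pow(r, e, n)) % n        # requester sends this to the signer
    blind_sig = pow(blinded, d, n)          # signer signs without ever seeing m
    sig = (blind_sig * pow(r, -1, n)) % n   # requester unblinds (pow(r, -1, n) needs Python 3.8+)

    assert sig == pow(m, d, n)              # identical to an ordinary RSA signature on m
    assert pow(sig, e, n) == m              # anyone can verify against m with the public key alone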
{ "pile_set_name": "HackerNews" }
Amazon stops hosting WikiLeaks site - jbrodley http://www.reuters.com/article/idUSTRE6B05EK20101201 ====== RiderOfGiraffes Choose your news source: <http://news.ycombinator.com/item?id=1959697> \- techdirt.com <http://news.ycombinator.com/item?id=1959655> \- cnn.com <http://news.ycombinator.com/item?id=1959633> \- arstechnica.com <http://news.ycombinator.com/item?id=1959607> \- bgr.com <http://news.ycombinator.com/item?id=1959335> \- npr.org <http://news.ycombinator.com/item?id=1959328> \- guardian.co.uk <http://news.ycombinator.com/item?id=1959308> \- readwriteweb.com <http://news.ycombinator.com/item?id=1959305> \- reuters.com <http://news.ycombinator.com/item?id=1959257> \- techcrunch.com <http://news.ycombinator.com/item?id=1959142> \- foxnews.com ------ jdp23 A great example of how a lot of "political" stories are hugely relevant to tech startups. Anybody who's thinking of hosting controversial content on AWS or other Amazon properties has to be looking at this closely. ------ pinksoda That didn't last very long eh?
{ "pile_set_name": "HackerNews" }
SocialMovie - Facebook Open Graph Ruby on Rails Sample Application - lutfidemirci https://github.com/lutfidemirci/socialmovie ====== username3 Demo? Screenshots? ~~~ lutfidemirci You can check the app <http://socialmovie.herokuapp.com/> ~~~ username3 I don't want to login.
{ "pile_set_name": "HackerNews" }
Can we reverse the ageing process by putting young blood into older people? - sergeant3 http://www.theguardian.com/science/2015/aug/04/can-we-reverse-ageing-process-young-blood-older-people ======
{ "pile_set_name": "HackerNews" }
Berkeley Astronomer Geoff Marcy Violated Sexual Harassment Policies - BDGC http://www.buzzfeed.com/azeenghorayshi/famous-astronomer-allegedly-sexually-harassed-students ====== ngoldbaum This is absolutely disgusting. I heard rumors about exactly this for years as a grad student, it's sickening that things are only being done about it now, and even then no repercussions of consequence will be visited on him. Does Berkeley really see the chance of an exoplanet Nobel prize as more important than the safety of students and staff? ~~~ nikolay "Safety", really?! There would always be rumours. ~~~ ngoldbaum Except now an investigation has substantiated these rumors and Prof. Marcy has issued a public apology. These are real incidents. And yet, he is still allowed to take on students. This is insane. ~~~ nikolay My point is that "safety" is a groundless exaggeration. As I've passed through Sexual Harassment training several times already, even a look could be harassment. To conclude that a scientist could be putting female students' safety at risk is both statistically and morally incorrect. ~~~ univalent Very depressing. Fire him now. (Although knowing Janet N they will likely botch this as well). As a former Cal Bear, we don't need Nobel laureates as much as we need to preserve Cal's moral reputation.
{ "pile_set_name": "HackerNews" }
Is AI Ready to Prevent School Shootings? - dsr12 https://www.theatlantic.com/technology/archive/2018/08/is-ai-ready-to-prevent-school-shootings/567035/?single_page=true ====== api No. Don't even need to read the article. You're looking for a very tiny needle in an enormous haystack. You're going to get flooded by false positives. If you turn up the filter enough to filter out false positives you'll miss real ones. If you investigate every false positive you'll create a civil liberties controversy plus you will get so fatigued investigating false positives you'll once again miss real ones. "AI" is not magic secret sauce that lets you escape the fundamental limits of statistics and information theory. Claude Shannon would like a word with you. The only solution to the school shooting problem is to ask why this is only a big thing in the USA. I think it's a mixture of culture, ready access to guns, and an awful mental health system (and health care system in general).
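api's needle-in-a-haystack objection is the base-rate problem, and rough numbers make it concrete. The figures below are invented purely for illustration (a one-in-a-million base rate, a 99% detection rate, a 1% false-positive rate); the exact values matter much less than the shape of the result.

    # Back-of-the-envelope Bayes calculation with made-up numbers.
    base_rate = 1e-6     # fraction of monitored students who are genuine threats
    tpr = 0.99           # chance a genuine threat is flagged
    fpr = 0.01           # chance an ordinary student is flagged anyway

    p_flag = tpr * base_rate + fpr * (1 - base_rate)
    p_threat_given_flag = tpr * base_rate / p_flag

    print(f"P(genuine threat | flagged) = {p_threat_given_flag:.4%}")              # roughly 0.01%
    print(f"false alarms per real detection = {1 / p_threat_given_flag - 1:.0f}")  # roughly 10,000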
{ "pile_set_name": "HackerNews" }
Please Learn to Code - sgdesign http://sachagreif.com/please-learn-to-code/ ====== VolatileVoid I'm actually kind of tired of hearing, "there's no other field that lets you create something from nothing" vis a vis programming. Look: I LOVE coding. I LOVE programming. I think it's fun and challenging, etc. etc. But it's not the only creative outlet that has ever existed. Have you considered a pencil and a paper? Crayons? Markers? Paints and a canvas? A hammer, some wood and some nails? "Ah, but that requires something to create something!" Yeah? So does programming. It requires a computer (or access to one). It requires programs to run your code - free or otherwise. In fact, I'd argue the barrier for entry into programming is SIGNIFICANTLY HIGHER than MANY other creative outlets. ~~~ groovy2shoes > It requires a computer (or access to one). It requires programs to run your > code - free or otherwise. I know what you're saying, but I'd like to point out that you don't _need_ a computer to _write_ programs. I started writing programs with pen and paper back in middle school (before my family owned a computer), and still often do that. Nowadays, most of them do end up being digitized and executed, though. Alonzo Church, Alan Turing, Haskell Curry, Moses Schoenfinkel, etc., were writing programs before computers were even invented ;) ~~~ scoot And Ada Lovelace before electricity. :) _"The notes [...] include in complete detail, a method for calculating a sequence of Bernoulli numbers [...] which would have run correctly had the Analytical Engine been built. Based on this work, Lovelace is now widely credited with being the first computer programmer and her method is recognised as the world's first computer programme."[1]_ [1]<http://en.wikipedia.org/wiki/Ada_Lovelace> ~~~ groovy2shoes _How_ could I forget Ada Lovelace? Facepalm moment. Thanks. ------ edw519 _Think about it: something I did reached 10,000 actual living people and had an impact (however small) on their life. That would never have been possible if I didn’t know how to code._ (ability to code) != (eyeballs reached) I've written blog posts viewed by 100,000 people in one day. I've also written software used by 1 person to save millions of dollars. Both are rewarding, but I'd still rather build something. I think _that's_ the most appropriate metric. ~~~ danso _(ability to code) != (eyeballs reached)_ Not directly, maybe. But having some acknowledgment of basic data principles (particularly delimitation, meta-ness, maybe even regexes) can substantially increase your ability to write digitally. Markdown, for instance. And of course, being able to maintain or customize an existing platform, whether it's WordPress or Tumblr. It's sad to say, but delimitation is __not __a skill that is in the mind of the average professional adult. Try sending someone a tab-delimited text file to someone who has only known CSV or XLS sometime. ~~~ roc I'd agree that there are computer skills that are generally useful, up there with reading, writing and math. But they're far less complex than _code_. And I can't say I've ever heard the "learn to code" people advocating them. ~~~ danso As a person in the media/reporting industry, I've almost committed to making it my life mission to teach people the usefulness of basic regexes. 
I've taught a lot of "learn to code" sessions...My main goal is not for them to remember the specific syntax, but to show that programming gives you the ability to repeat a task thousands of times (for loops) and to treat different cases differently (if statements)...how many (non-worker-drones) people would be content spending significant amounts of their dayjob time doing manual copying-and-pasting and click-series-of-links-to-download-reports if they were aware of these basic coding constructs? ------ blindhippo I remember when VBA came out in MS Office. Now your average tech savvy office drone could morph their excel spreadsheets into a full blown application capable of processing business data without having to jump through any of the traditional hoops the evil IT department normally demands. And we ended up with a ton of buggy, dangerous "mission critical" piles of garbage because they were "designed" and built by non-programmers. Software programming is a discipline on the order of engineering and it will continue to get more complicated and require more and more education going forward. So no, not everyone should learn to "code". We should be encouraging people to THINK like a coder - to approach problems in a way that identifies root causes and starts coming up with proper solutions. I'm not in the valley so maybe it's vastly different down there, but outside (in Canada) tech is still considered a very specialized field, software development especially. It is not considered as simple as basic household skills like plumbing, cooking and building a deck. I would not expect my lawyer to 1) know how to code or, 2) code in a professional and useful manner. ~~~ kalleboo I still think those «buggy, dangerous "mission critical" piles of garbage» are valuable. If there was no VBA, the jobs these scripts were created for would still be done manually, with a mouse, by some secretary somewhere. "Getting a professional to do it" would never enter into the picture when you're talking about a BigCorp with a conservative, limited-budget IT department. ~~~ pcroom During my internship at a manufacturing company, I knew enough about VBA to record a macro in Excel (just record, no actual coding involved) to automate ~10 minutes of work I had to complete monthly and included step-by-step instructions for how to run the macro and what the macro did. When I came back a year later after graduating, I learned the process was again being done manually because my replacement did not have even a basic understanding of how to interpret what a macro does from the script. A few months ago, I automated ~6 hours of monthly Excel work through ~4 hours of trial and error of recording macros and manually editing them. I have zero training in coding other than osmosis from my brother who is an iOS programmer, but I wasn't afraid to screw up enough to figure it out. Based on my first experience, unless my eventual replacement has at least a curiosity about how macros work I wouldn't trust them to effectively run my script over the long term--if anything breaks, they won't be able to fix it. The point is, a lack of basic, basic understanding of coding could end up costing the company the 75+ hours/year I was able to save with a "buggy, dangerous...pile of garbage" when I either move to a new role or leave. This was something which would never have been a high enough priority for our coders to write for us. Multiply that through 25 people in our finance department, and you're talking about needing an additional full-time employee.
Can anyone recommend a good place to learn enough VBA to move past the buggy and dangerous stage? ~~~ sopooneo You hit on an important point: curiosity. Most of the best programmers are deeply curious, and I don't think that can be taught, though I do believe it can be cultivated from an early age. On the other hand, showing people how to safely experiment in ways they can be sure won't screw things up may be very empowering and allow them to express their creative drive. ~~~ sigkill I kind of agree with you. I think everyone is inherently curious right until the point when a smart-ass parent says "Because I said so, and now shut up and don't ask questions." My parents did not do that. Even today, they pride themselves on the fact that when I used to asked them a "What would happen if..." question, they'd be like "Why don't you try it out". Sure I may/may not have got myself in a sticky situation, but the point is they didn't shut me down. Also, if I ever broke something (which was fixable) my dad would actually open that damn thing and fix it during the weekends, talking and explaining to me what the things inside do. Sure I may be only 7-8 or even 10-11 but hey that adds more to my curiosity. At their angriest, my parents have assessed the situation to check if it's fixable by a human at my age/capacity and given me a stern "fix it" look. Boom, that was a blessing in disguise as well because now I'd be all like "Shit, how DOES this work". [Note: I'm neither married/not have kids...yet] The problem that I've seen with others/their parents/their children is that they get annoyed quite quickly. And when you shut down a 5 year old, you can clearly see the pained expression on his face. They simply lose interest in everything and end up becoming drones. And they're afraid of doing anything new because they worry if they screw it up, their parent will come home and beat them up. ------ jenius Sacha, I am a fan of yours and usually like your material, but I just can't agree here. You took so much for granted in this post. Deploying a website that 10,000 people will see is not trivial, whether you personally did it or not. Here's my beef, specifically: 1\. You already knew how to design. In fact you are a very talented and well known designer. If you didn't have design skills, chances are a huge huge amount lower that anything you put out will make the rounds - design is super important, and you of all people know that. And that's something you need to learn as well, so tack that on top of the time it takes to learn how to code. 2\. Learning how to code != making websites. In fact, they are very different. To make a website, not only do you need to either have a designer on hand or be good at design as mentioned above, but you also need to have a very good understanding of how the web works. That means filesystem structure, http, ftp, domain names, web hosting, then add html and css to the javascript you've been working on. THEN if you want your site to be anything more than static, throw in back end code, databases, web frameworks, etc. on top of that. This is a MASSIVE stack of things to learn, and I don't know a single person who knows all of this and doesn't work full time doing this stuff. "Learning to code" seems like a cute thing you can make a resolution to work on as a hobby, and maybe it is. You might be able to pick up the bare basics of programming in your free time, if you work hard. But making something significant like you claim here is a completely different deal. 3\. 
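Picking up danso's loops-and-ifs point and pcroom's monthly report: the whole idea fits in a few lines of Python. The file name and column names below are invented for illustration; the point is only that a for loop plus an if statement replaces an afternoon of copying and pasting.

    import csv
    from collections import defaultdict

    totals = defaultdict(float)
    with open("monthly_report.csv", newline="") as f:
        for row in csv.DictReader(f):            # the "for loop": visit every row, no copy-paste
            if row["status"] == "closed":        # the "if statement": keep only the rows that matter
                totals[row["department"]] += float(row["amount"])

    for department, amount in sorted(totals.items()):
        print(f"{department}: {amount:,.2f}")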
You never dispute or even address the main point behind Atwood's article - that while programming is great to learn, the trend that 'everyone should learn to program' as a base skill along with reading, writing, and math is ridiculous. It's nowhere near as important as these other skills, and he makes a number of other valid points as well which I'm sure you don't disagree with. You argument was simply that programming is cool, so you should learn it. Sure, I agree, and I would have said the same if I only read the title of Atwood's article. But I feel like the way you contested it was uninformed and completely missed the point. ~~~ agentultra _3\. You never dispute or even address the main point behind Atwood's article - that while programming is great to learn, the trend that 'everyone should learn to program' as a base skill along with reading, writing, and math is ridiculous. It's nowhere near as important as these other skills, and he makes a number of other valid points as well which I'm sure you don't disagree with. You argument was simply that programming is cool, so you should learn it. Sure, I agree, and I would have said the same if I only read the title of Atwood's article. But I feel like the way you contested it was uninformed and completely missed the point._ I'll dispute the main point. As computers permeate more of our everyday lives it becomes necessary to understand them on some level. Nobody needed to learn how to read until there were books everywhere. No one needs to be an English major to pick up a book and learn to read. The benefits we know are enormous to adopting this skill. But lets consider a world where books were everywhere but only an elite few felt it was useful to teach people to read. Knowledge as we well know is power and if only a few had the ability to pass on their knowledge amongst themselves then they would have a significant advantage over those who cannot read. The unfortunate masses who could not read would never know the full extent of the forces that work against their best interests. How could they? Now we're in a world where there is more computing power in your pocket than that which sent human beings to the moon and back. Yet the non-programmer has no idea just how useful it is. The only literacy they have with computers are as appliances. They don't realize that these devices carry with them processes that may or may not operate in their best interest and they have no way of knowing that without being able to at least have a basic literacy of computers and programming. Becoming a master at programming is still a difficult task that few will achieve. Just as becoming the next Nabokov or Salinger won't come to every person who picks up a pen and paper. However that doesn't mean we don't need to teach everyone how to write. We give them the basics and its up to them to use those tools if they so choose and pursue their own paths. However if we keep them in the dark then they'll have no hope. Teach everyone to code. Computing is emerging as a new medium of expression and the technology is embedding itself in our every day existence. People need to be literate so they are able to understand the consequences and benefits of this technology. It's 2012 and most people I know still think computers are practically magic. They should know better and its our fault for not educating them. ~~~ cageface _As computers permeate more of our everyday lives it becomes necessary to understand them on some level._ I vehemently disagree on this. 
Apple has made a silly fortune proving the contrary. If you need deep understanding of a device to make it useful then the designers have failed to provide you with the right abstractions. ~~~ larrys Agree. And along the same lines an entire industry grew around the fact that computing products before Apple (particularly DOS, Windows etc.) were difficult enough that the average person (say someone who might be urged to code now) did not have the desire or aptitude to learn how to do it themselves. I'm sure everyone knows many of these people who giggle about how stupid they are with computers and have no interest in learning how to fix the simplest of problems. Some people just lack the appropriate base skills to fool with computers and certainly programming. They will always need a "tech guy" and have no desire to change that for a reason. It's not that simple to them. Not because they are stupid but because it's not their thing. ------ babarock I expected a similar post to hit HN, since I too was shocked by Jeff's comments. However I do not think people should learn to code for the same reason. Sure writing a website is cool, but then again, learning plumbing is cool, which is the basis of Jeff's argumentation. Plumbing can empower you by giving you control over your own house and appliances, etc, etc. I think the real reason people should learn to code, is because an increasing number will have to deal with machines in every day lives, often taking decision affecting the work (and general lives) of others based on their understanding of those same machines: \- The legislator who has to pass laws about computers and/or the Internet. \- The manager who has to assess the usefulness of a new software application. \- The teacher educating kids and preparing them for the modern world. \- The consumer who wants to make an informed choice when chosing the latest gadget, not blindly follow what marketing departments tell her to. \- The judge and jurors overseeing the Oracle-vs-Google case. I want to make the distinction that I wish people would learn to Code, not so they can "make" stuff, but so they can "understand" stuff. ~~~ sgdesign I don't think the comparison is apt, because learning plumbing only helps you with plumbing. Whereas coding knowledge can be applied to a lot of fields. To me, there's an argument to be made that coding can be considered (or maybe, will one day be considered) like a life skill on par with reading, cooking, or playing music. I agree it doesn't look like it right now, but I'm sure that a couple hundreds years ago the idea that everybody would one day know how to read seemed just as ludicrous. ~~~ babarock I am absolutely not arguing agains this. Here's the extract from your text I disagree with: > I can’t think of many other skills that enable you to create something from scratch and reach as many people as knowing how to set up a simple website. > Just last week, I was able to come up with an idea and then launch a site in 2 days. That site was then seen by about 10,000 people in a couple hours. > Think about it: something I did reached 10,000 actual living people and had an impact (however small) on their life. That would never have been possible if I didn’t know how to code. This text is arguing that the value of learning programming is in the software you will _create_ and the impact it will have on a population. This would be a valid point when arguing why one should become a programmer. 
Drawing on an analogy similar to the one you make: Learning to write is crucial today, even though I will probably never be a published author. Similarly, learning to code is important, but not because "you too, can make a website!". ~~~ sgdesign Well, I think both point are valid. You should learn to code as you learned to write, to be able to function in a modern society. And another good reason for learning to code (just like learning to write) is that it'll let you reach a lot of people, and possibly become extremely rich in the process. I guess that second argument is what Jeff Atwood is mostly disagreeing with. ------ ralfd I think everybody should visit Reddit, for one simple reason: _knowing how to make a meme is hugely empowering._ I can’t think of many other skills that enable you to create something from scratch and reach as many people as knowing how to set up a simple rage post. Just last week, I was able to come up with an idea and then photographed my cat in 2 ways. That photo was then seen by about 10,000 people in a couple hours and I got 2000 karma points. Think about it: something I did reached 10,000 actual living people and had an impact (however small) on their life. That would never have been possible if I didn’t know how to procrastinate on the Internet. ~~~ bcjordan Your point in the wider discussion aside (sure, population-wide imperatives are a dangerous rhetorical device), I actually think actively participating in the Reddit rage comic community can be a very valuable experience. It will teach you how to tell a story and communicate the humor of a situation using a simple image editor. You can watch the real failure-learning happen in [1]. Thinking your work is comedy gold and getting it downvoted into oblivion is a great, highly-concentrated learning experience. English teachers in Japan are also using rage comics to supplement their English lessons and inspire their students to learn words so they can understand their classmates' jokes [2]. [1]: <http://www.reddit.com/r/fffffffuuuuuuuuuuuu/new/> [2]: <http://www.reddit.com/r/EFLcomics> ~~~ ralfd I love the non-sensical strangeness of this: <http://i.imgur.com/Lf6A8.png> And that on the Reddit comments the ordering of the panels is debated. ------ phatboyslim Computing is taking a similar path in everyday life as finance. As individuals, I don't feel we need to learn how hedge funds, or complex derivatives work, but nobody denies it is beneficial to learn how to manage money, make basic investments, and plan your retirement. The urge to teach programming isn't as much about teaching everyone complex and theoretical computer science, but to teach basics that will benefit them as we move into the information age where technology permeates nearly every business. ------ photon137 I think everybody should learn to code _at school_ when they're kids. That way, they can decide if they want to become real programmers/software- engineers or something else. Just as every kid learns mathematics does not mean he/she has to become a full-fledged PhD researching manifolds in Topology. But they still need to know how to calculate percentages, basic statistics etc to get through life more easily. In the same way, giving instructions to a computer the "hard" way instead of via clicking on buttons and letting magic happen is often a good exercise to appreciate the power and freedom it gives you. 
I remember doing this when I was 9 or 10 and doing locate, print, cls repeatedly in a BASIC loop let me create an animation I could control quite precisely (well no, CPU cycles came into play!) But that's the precise reason, 18 years on, I delve into programming GPUs, wrote games with advanced Direct3D shaders for them and am currently wrting OpenCL code to solve complex equations on them. It's all because of the locate, print, cls loop! ------ tobias3 Discussions where one does not agree on the topic are always kind of pointless: You can view "coding" as... ...an extension of mathmatics -- The ability to express an algorithm in a way that a computer understands it. ...an engineering discipline, where you build complex products by appliying good practice. The first thing, can and should be tought at school. In fact where I live it is tought there. Needless to say it has the same reputation as math... The second thing is something you have to study and become an expert in, because if you are not companies loose money or you might even kill people. ------ kareemsabri "The first step is letting people know that learning to code is not that hard, and that if they put their mind to it they have a high chance of succeeding." Is this even true? I think it's pretty well documented that the vast majority of the population actually has a high chance of failing. Look at failure rates in intro CS courses. Remember this? [http://www.codinghorror.com/blog/2006/07/separating- programm...](http://www.codinghorror.com/blog/2006/07/separating-programming- sheep-from-non-programming-goats.html) ------ mns2 Both of these blog posts are stupid. They should both really be saying "Please learn more about the world in order to better accomplish your goals." Sometimes it's useful to learn how to code because it teaches a different way of thinking. Sometimes it's more useful to just learn how to think differently. These choices depend on a lot of things, but one is more optimal than the other for any particular scenario. If I could get everyone in the world to read a few articles on rationality, I would, because it wouldn't take long and the people who could understand it would really get something out of it. If I could get everyone in the world to learn how to code, I wouldn't, because it would take longer and I expect far fewer people to get value out of it. I can't do either of these though, so both are pretty useless to consider. People toss around information more and more quickly these days, so it might be useful to share a bit of philosophy. Tell people to try hard. Tell people to learn. Tell people to find out about the things they don't know, to see if they could help them. Tell people to reevaluate their goals, to find which contradict each other and to sort themselves out. Be kind or they won't listen. Don't be patronizing in your kindness, or they won't listen. If you want to help people become better, do it for yourself first. Learn more, become smarter, try harder, do better, and eventually the people around you just might see it and start doing it, too. Or maybe after a while you'll start helping other people become better. There are still people who suffer today. Sometimes you can't help them; sometimes you can. ------ tgrass I wonder how many folks suggesting everyone should learn to code will replace a punctured car tire instead of plugging it. ------ hesselink I think learning to cook is an interesting comparison. 
Would people react the way they do now if someone wanted to learn how to cook instead of learning how to code? I think learning to cook is a really worthwhile skill. It will teach you about food, ingredient and chemistry. It will maybe get you thinking about how we produce food in our world. The same goes for coding. It will get you thinking about a lot of things that really matter to your life and the world. ~~~ debacle Learning anything is a good comparison. There is no harm in learning. Better to be a mediocre guitar player than not be able to play to guitar at all. It expands your mind, and lord knows you were just going to spend the time watching Desperate Housewives reruns. ------ islon Programmers thinking their specific skills are universally useful and should be taught at school... Yeah right. Most people don't even remember the basic math they learnt at school. I wont say programming skills aren't useful and rad but one could say the same about basically every skill and I don't see electrical engineers saying "please learn electrical engineering, it's very useful". ------ k_kelly Programming is an adult way to treat a computer as opposed to the absolute infantilism most people approach their computers, or devices with. It's as much to say I understand that what happens on the hardware end is what is happening as the consumer, even though the processes seem completely different. And none of it is easy or obvious. Dijkstra said teaching computer science was absolute cruelty because it does not truly reflect anything else in the world. If you are going to work with computers don't think that an email server is an online mail room, don't go down the road of thinking that copyright law holds from one medium to the other because the idea behind the product is the same. Don't think that skype is a telephone, don't think that facebook is the beatles of today, do know what something is, do know that the only intersection between life and computers is via maths and logic, do understand that these things matter now and will matter in the future. ------ DennisP So we complain about politicians passing stupid technology laws, and as soon as a politicians says he wants to learn something about the tech, a prominent programmer tells him "don't bother?" If you don't know how to code, it probably seems perfectly reasonable to have a computer that you can't program yourself, with everything locked down. If that's the future you want, then sure, tell politicians to leave the coding to the professionals. If you can code, you start to see the computer as a machine that can do anything you want, instead of just the things some app store makes available to you. That freedom is addictive. You start demanding it. Cory Doctorow's fears about the end of general computing will come true unless lots of people get addicted to that freedom. <http://boingboing.net/2012/01/10/lockdown.html> ------ grovulent I'm going to leave this here - since I've been talking about this for a while. [http://reviewsindepth.com/2011/04/why-everyone-should- learn-...](http://reviewsindepth.com/2011/04/why-everyone-should-learn-to- program/) I really think the issue is deeper than current commentators are addressing. ------ pefavre Interesting point: people need to know how different it is to customize a WordPress theme and building a Rails CMS. If, before knowing how to actually code, people can judge of the quantity and quality of work required to build something, the world will certainly be better. 
------ brudgers As I recall from the 1980's, there are two types of people in the world, programmers and victims. I think Atwood's point is valid in so far as the world probably doesn't need politicians coding the software by which government services are provided. Nor do we need everyone writing their own web pages, one GeoCities was enough. On the other hand, it seems to me that programming is going to increasingly be seen as a part of basic mathematical literacy. It's just a whole lot easier to solve a layered arithmetical problem with javascript than with a conventional calculator. In other words, most people should try to learn the lightweight scripting which allows for better exploitation of all the computing tools. ------ unreal37 A lot of people are confusing "learning how technology works" with "learning to code". I think society would benefit if more people understood how things worked. But not necessarily if everyone could code. Perhaps learning to code is a gateway to understanding technology. But that's a pretty steep learning curve just to explain to someone how a web page is generated and served. To keep the plumber analogy, I'd be a better homeowner if I understood what all the pipes in my house do, and the importance of proper care and maintenance. Or when something breaks, how to turn off the water main without calling 911. ------ __abc I can hear it now, "but hey, during code year I learned how to hello world, why is your estimate so hi, it's not that hard". We should REALLY do this with healthcare. Who needs doctors. All hail self diagnosis year 2013! ~~~ gk1 > We should REALLY do this with healthcare. Who needs doctors. All hail self > diagnosis year 2013! Did you ever take a biology class? ------ Osiris I think the discussion should not be that everyone _should_ learn to code but that everyone should have the _opportunity_ to learn to code, if they want. We shouldn't be arguing why we should or shouldn't learned coding over other skills, like plumbing. I praise people that put in work to make learning (such as coding) an easily accessible endeavor for those that want to learn. I would also praise Khan academy and universities that have made learning a variety of skills easily accessible to the people. As a society, let us push to make education and learning open and accessible to all. ~~~ sgdesign Very good point, I should've phrased it that way. But I think the first step in _giving someone the opportunity to code_ is to _make them want to try it_. So that was the intent behind me saying "everybody should learn how to code". ------ Czarnian Learning how to code has about as much real-world utility as learning how to rebuild an engine block. It's necessary to know if you're in the industry. It's interesting information if you like that sort of thing. For everyone else, it's a non-factor. They'll never be in a position where coding will solve a problem for them. Even if they are made aware of a potential solution involving code, they won't be bothered to try. Knowing how to change the oil in your car is a far cry from wanting to change the oil in your car. ------ elisk I agree more with this rather than the other one - I see coding as speaking, writing, cooking [food], or composing music. Sure, not everyone in the world HAS to know how to compose music, but we still know how to compose a simple melody in our heads, and we sure as hell need to know how to write. 
Coding isn't hard, coding is simple - simpler than a lot of other complicated things we have to learn as we grow up - and as such it should be mandatory in a world governed by computers. ------ trustfundbaby <sarcasm>I think everybody should learn to do basic medical procedures, for one simple reason: knowing how to do it is hugely empowering.</sarcasm> Quite a few people don't have the time or the inclination to sit down and learn how to code, and to keep making like its really simple is kind of silly. I took my first programming class in Java many years ago, and even on basic things like operators and variables, half of the class was completely lost. Not everybody can do what we do. ~~~ j_s Drifting off on your carefully-tagged tangent: Basic medical procedures like CPR, the Heimlich Maneuver, using an epipen, applying a tourniquet, etc. ? ~~~ objclxt It was tagged in sarcasm, but to expand upon the dodgy analogy couldn't you argue that's just the medical equivalent of basic computer literacy? ~~~ j_s Well, CPR certification requires some effort but I certainly couldn't argue the others on the same footing. ------ jwoah12 It seems to me like there are ulterior motives behind a lot of programmers' hostile attitudes toward non-programmers learning some of the craft. If I didn't know any better (and I don't), I'd say that people think, consciously or otherwise, that it will somehow cheapen their skills or experience if some newbies learn basic development. I disagree. I think that if anything, it will give others more of an understanding and appreciation for what we do. ~~~ cageface Not me. I'm just tired of once again seeing all the hoi polloi rushing in because some dumb photo sharing site just sold for a billion dollars and this is going to be a easier way to earn their boat and summer home than taking all those boring business, law and medicine classes. ~~~ jwoah12 I agree that people trying to learn in order to get rich quick is good for nobody (it's akin to the 100%+ increase in CS enrollment around 2000 that I remember some of my professors telling me about). That being said, Bloomberg's initiative to bring NYC up to speed with the technology through better education and a high-tech business environment certainly predates Instagram's sale. ------ sopooneo I used to teach physics and the kids who didn't love it would often ask why they had to learn it. I didn't give them any baloney. I admitted that the vast majority of them would never need it professionally. I did point out that logical thinking was important and physics would help them with that. But I think the truth is that science, like most other aspects of a good education in the "arts and sciences" is actually for the benefit of society as a whole. I would bring in articles from major News outlets and ask my kids to spot the blatant scientific errors. Do you know how many articles report things that clearly violate conservation of energy? It is so rare to see a science article in the mainstream press without errors, that I don't even get surprised anymore. Having people with basic science knowledge will allow them to be savvier consumers, better citizens, and, in those cases where their lives take them there, vastly better public servants. Knowledge about programming is not an exact parallel, but I think it's reasonably close. I agree with Atwood that programming is not as foundational as reading, writing, and math, but it's right up there with a few other things that would be very good to teach everyone the basics of. 
~~~ platz relevant: <http://xkcd.com/1050/> "The only things you HAVE to know are how to make enough of a living to stay alive and how to get your taxes done. All the fun parts of life are optional." If we measured everything by how much we used it professionally, we wouldn't learn music, play sports, or any number of things we focus on in schools. ------ EternalFury If "learn to code" means "learn that coding is a laborious profession", then yes, by all means, learn to code. I am definitely wary of people who tell me: \- "Why don't we..." \- "We just need..." \- "How hard could this be..." \- "Quickly change it so that..." \- "We can always change it later if it doesn't pan out..." The disconnect between people who know how hard it can be and the people who assume it's as simple as talking about it is way too large as it is. ------ mattschoch For learning to code: -You develop analytical and problem solving skills. Similar to math, but more practical (how many people really use Calculus on a daily basis?) -You understand how computers work. No explanation necessary. -All the other good arguments people have made. Against learning to code: -More bad programmers. -Even worse, more non- programmers who think they can code. Example: Someone does CodeYear and suddenly thinks they are a expert coder. Starts web design business. Sells poorly built wordpress themes to unsuspecting small businesses. If they have low prices, it makes potential customers think that all coding should be cheap("if they can do it for practically free, why should I pay you more?"). Thus, the entire industry suffers. That isn't to say CodeYear is bad. I like it. It's great for people who actually want to learn to code and don't know where to start. It's bad for people who think they can sell websites after a few hours. ------ jwwest What's frightening about the nativity in Atwood's post is that his basic assumption is that the more people know the worse it is. This is in direct contradiction to the fact that generally speaking the more educated people are, the better and stronger our workforce, communities, nations, etc are. Learning to code != becoming a professional programmer. I can't count the number of times I wished that a graphic designer I was working with had even the basic idea of how HTML works. Non developers should learn at least a bit of appreciation and even a baseline knowledge of what we do if they work with us on a daily basis. Also, developers need to actually learn some basic design skills to enhance their communications as well. ------ WiseWeasel Maybe people don't necessarily need to learn a particular programming language, API set, platform, etc., but a basic awareness of how computers can be used to automate tasks, how loops and conditionals combined with arithmetic can be organized to make something useful, would really help people realize the potential benefit to be unleashed with an understanding of computers. I had a computer class in high school where we were taught BASIC, and we were eventually taught to use HyperCard in a likely atypical "humanities" class I took; those experiences were certainly very enriching and empowring to me, even if I never touched BASIC or HyperCard again since. Is programming even still a part of high school curricula? ------ S_A_P I think that there are many assumptions being made here. The biggest of which is that everyone is even _interested_ in writing code. 
I know many people whose eyes glaze over the second I mention the word code, and would consider it akin to water boarding to learn code. I don't know that everyone is even wired to write code. Its certainly a certain type of person that can abstract problems logically and create good code from these abstractions. I personally think computer literacy/competency is much more important than the ability to code. Unfortunately we still aren't to the point where we can say that most of the population is there. ------ peterwwillis I'd say coding is really a lot like learning a martial art. You get exercise (brain/body), you can use it to overcome difficulties (formatting complex documents/leg-sweeping a belligerent drunk), and you can develop it into a career if you focus. But really it's just a set of tools that have a wide ranging use. Of course, you also have to practice so you don't get rusty. I think everyone who has a mind for problem-solving should learn to code. Heck, even people who aren't especially right-brained can use code to be creative and artistic. And sometimes it's just fun, damnit. ------ orbitingpluto Learning some VBA counts as learning to code in my opinion. I've temped before and been told to do nothing more than prettify and unify formatting in an Excel spreadsheet (for later insertion into a db). (555)555-1212, 555.555.1212, (555) 555-1212, 555-555-1212, and so on to some standard format is a 10 minute job regardless of size. Not everyone needs to be able to really program, but a healthy respect and knowledge of what a couple of one liners can do saves lots of money. That's money that could be spent on truly productive tasks that would actually drive the economy and not burden it. ------ docgnome I think I just don't understand a growing population on HN. Since when is programming about building websites? Or desktop apps? Or making money? What happened to ars gratia artis? ~~~ ckolderup This is the boat I'm in. Everyone seems to be applying this exclusively to BUSINESS and DAY JOBS and MISSION-CRITICAL APPLICATIONS and CLIENTS and CUSTOMERS. Why not empower people to spend their leisure time or their artistic pursuits in a new, challenging way? ------ justinj the implication that learning to code results in 10,000 viewers? please. it wasn't your coding skills that brought in those UVs. by your logic, all of the sites created by shit-hot coders like us should hit 10,000 views in the first two hours easy. you're propagating the kinds of myths that i debunk daily with friends and family, who grossly underestimate the difficulty of attracting eyeballs to their content (and whose motivation sadly erodes when said eyeballs don't instantly materialise). if people have something to contribute, there are so many tools and services out there that negate the need for coding it isn't funny. i always advocate non-coders to test out their market theories using tumblr, wordpress, facebook, twitter, posterous, pinterest, et al before rushing off to pay someone to develop something. (in fact, i think us developers should bill ourselves for our own time on our own projects, but that's fodder for another post). should the average person understand tech? definitely. understand the web? for sure. hypertext markup? why not. javascript? yea, you can skip that. jeff atwood, you are right on the money. ------ AndrewWorsnop I think learning how computers/software works is just as useful as learning history or geography or science, which we already make students learn. 
------ bryze I have to agree with Jeff Atwood's original post, here. The complexity of modern society is requiring individuals to be more specialized. If anything we need improved communication so that inter-disciplinary projects proceed efficiently. As much as we exalt the individual here in the states, a team almost always surpasses what an individual can accomplish. ------ 27182818284 In the Information Age, literacy isn't defined as being just literate, it means being computer literate. All of these people have it wrong by saying people need to learn coded to like build a website or start a startup, they need to learn basics so that they aren't overwhelmed when they need to export something to CSV from Excel and shutdown. ------ guscost Here's my take: C is a precise language, in a certain sense, whereas English is not. If I can successfully communicate information to another person using C, that person can put a lot of trust in their understanding of my idea. Precise languages are very useful, and everyone should learn that they exist, if nothing else. ------ m3rv When this crap about "learn to code" will stop? Most of people don't know how to code his TV set. Why? They don't care!!! You won't sell them Your programming course!! You won't sell them Your ebook... You're just making mess. \--> To all, who want to sell something, with this "learn-to-code" propaganda... ------ diego _"knowing to code is hugely empowering."_ That is not a good reason to learn to code. Being beautiful and rich is empowering. Being tall and strong is empowering. So what? A large number of people cannot be beautiful, rich, tall, strong, or good enough coders to feel empowered. On a completely unrelated note, everybody should learn CPR. ------ acoyfellow I love your mind Sacha. You are brilliant in my eyes, but I don't think the Mayor of NYC needs to code. ------ alexchamberlain Dear Author, are you a mediocre coder? Edit: This wasn't a derogatory comment. I merely wanted to point out that the Author probably isn't a mediocre coder and those people that are probably can't touch 10000 people in just 2 days. I really like thetoolbox.cc. ~~~ sgdesign I would probably call myself a mediocre coder compared to the average HN'er. But I don't think you necessarily need great coding skills to reach people. The Toolbox really is nothing more than a customized WordPress theme, and I'm sure I could find lots of similarly successful sites that are not overly complex from a technical point of view. ------ sasha-dv A slightly off-topic question: How much can you learn about computers by doing the code year? Isn't this (<http://www.codecademy.com/tracks/code-year>) just a JavaScript course? (not being a jerk, just curious) ------ tedmiston I agree with the thrill of being able to impact millions of people in some substantial way. For some people, that's through code; for others, the medium is paint, magazine articles, blog posts, books, posters, music, ... ------ jheriko I've found coding useful in every non-coding job I've had where I have sat in front of a computer in an office... Office jobs, and general running of any business can be improved with knowledge of code - or code itself imo. ------ atirip "I think everybody should learn to code, for one simple reason: knowing how to code is hugely empowering." Oh please, before lecturing what everybody should do - first learn some empathy, that will blow your socks off! ------ T-zex Please do not compare coding to cooking. 
~~~ nicholassmith Please justify _why_ you think it's a bad comparison? I think it's apt. Cooking shares many qualities with coding, you're under time deadlines to produce something for a end consumer (for the most part). There's lots of competing methods and you have to select which ones work best for the product you want to produce. Most importantly you can cook something amazing that everyone loves, or a charred lump that no one will touch. Much the same as coding. ~~~ T-zex Cooking is way more simple compared to programming. The recipes are easily understood - a complete opposite to software specs. Cooking does not evolve that fast compared to programming. Everybody loves their grandma's pie. And most important cooking is manufacturing and programming is designing. Regular cooking is trying to reproduce something that was done as close as possible, while programming involves dealing with lots of specifics. ~~~ nicholassmith Really? I think the molecular gastronomy (<https://en.wikipedia.org/wiki/Molecular_gastronomy>) chefs would disagree with you about that. The argument still holds, programming can be ridiculously simplistic and so can cooking. ------ davtbaum Anyone else bothered by the crappy coding style in that screenshot? ~~~ sgdesign What's so crappy about it? ~~~ sasha-dv No comment on coding style, but you may want to look into this: <http://imgur.com/sE7ul> Horizontal scrolling - a possible bug? Screen resolution: 1024×768. UA: Mozilla/5.0 (X11; Linux i686; rv:12.0) Gecko/20100101 Firefox/12.0 ------ crazy_eye Please... read the original article. ------ berntb Some free advice to the kids (and then _get off my lawn!_ )... Remember that Buffet quote -- he has seen a few bubbles: "... try to be fearful when others are greedy and greedy when others are fearful." Edit: If it isn't obvious -- the meme "everyone should learn to code" is the largest top-of-a-bubble signal I've ever seen. ------ georgieporgie Skilled programmers are already horrible at effort and time estimates. The last thing I need is a client who "programmed" some barely-functional shell and who is certain he/she only needs me to "finish up" some of the details.
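orbitingpluto's phone-number cleanup above really is a couple of lines. A rough Python version, assuming plain ten-digit North American numbers and leaving anything unexpected untouched:

    import re

    samples = ["(555)555-1212", "555.555.1212", "(555) 555-1212", "555-555-1212"]

    def normalize(raw):
        digits = re.sub(r"\D", "", raw)          # strip everything that isn't a digit
        if len(digits) != 10:
            return raw                           # don't guess about anything unexpected
        return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"

    for s in samples:
        print(normalize(s))                      # each prints as (555) 555-1212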
{ "pile_set_name": "HackerNews" }
Update on Oracle Layoffs - jen20 https://www.thelayoff.com/t/KTCW4qz ====== lkrubner Is Oracle doing well? Ten years ago I thought they were going to suffer as Open Source projects got better and better. That has not happened, or at least, that didn't happen nearly as fast as I thought it would. Oracle faces intense competition from Microsoft, and also from Open Source. The fact that Oracle has continued forward, seemingly unharmed, has so far struck me as something like a magic act. Are these layoffs a sign of Oracle in long term decay? Or is this merely typical corporate reorganization that any large company goes through every few years? ~~~ raesene6 For me Oracle has gone into the arena of companies like IBM and CA, where they pretty much only deal in enterprise IT scenarios and have no interest in small/end-user computing. Those companies can continue to make money on support and upgrades for a really long time as the pace of IT change in core enterprise systems is glacial. Personally I think this kind of high-end only strategy is a really bad idea, as it cuts off your supply of new people who know about your technology and are developing new software for your platforms. I always thought it was a big reason why Linux did so well against proprietary Unix systems like HPUX, Solaris, AIX etc, and in turn why they did well against even more closed-off systems like mainframe products. When it's hard for people to learn to use your product, due to expensive hardware and licensing fees and complex install processes, fewer people do it, which directly leads to increased costs to hire people in that arena. Compare that to Windows/Linux admins who can easily learn their craft on a cheap/free software and cheap hardware. Microsoft is a good example of a company who generally gets that problem despite playing in the enterprise space. You can get 180-day licenses and VM images for most of their products for free (completely fine for training), and there's a load of cheap/free training materials. ~~~ emmelaich > Oracle has gone into the arena of companies like IBM and CA ... Exactly: [http://www.dbms2.com/2015/12/31/oracle-as-the-new-ibm- has-a-...](http://www.dbms2.com/2015/12/31/oracle-as-the-new-ibm-has-a-long- decline-started/) Personal story - they're pushing exadata and exalytics. Products based on 10 year products based on 15+ year old ideas. ------ WhiteSource1 Looks like lots of trouble in the whole ERP space. SAP isn't seeing people moving to HANA. Panaya is struggling. And Oracle EBS market share is declining. ~~~ orly_bookz I don't know if all ERP software is like what I've seen but what I've seen is a hot mess. If your company is ever looking to move to the ERP space from a homegrown legacy system like we did, don't buy Infor. I'd rather deal with greenscreens for the next twenty years of my life than have to deal with Infor products and support for one more year. And I used to _hate_ old, tired Big Iron. ~~~ WhiteSource1 Don't know anything about Infor but these are huge complex systems that people don't want to change because there is a huge risk of changes. So now they are falling behind, running up huge amounts of technical debt, and staying on old systems. I know with SAP most people have never heard of HANA and most of the companies that migrated did so because they were on very very old versions. Oracle EBS is pushing this migration to the cloud that is making a bit of traction with SMBs but not going anywhere in the enterprise and changes are difficult. 
------ rwmj Can someone explain why Oracle bought Sun and then screwed it all up? Was it a
deliberate plan? Did they make money from it - and how? ~~~ godmodus They capitalized on
Solaris (read SunOS) for many years, making sure the online documentation is obtuse and the
best option for companies is to buy pricey support contracts, or send their DevOps to
training courses, at a premium. They really made sure their online community/footprint was
kept to a minimum - googling issues and errors often led nowhere and left no choice but to
call upon the support gods who'd log in and fix things. The support folk were nice people -
but it left you none the wiser about your issue. They also are capitalizing on Java, using
premium licensing if you're building with Oracle Java for the enterprise and planning to
make money from it. My source is my own experience working with Solaris, which beyond ZFS
and OracleDB, has very few things going for it - it made me feel like a bad sysadmin,
really. Maybe I was, maybe it is a better OS than my experience taught me. Now they've
decided to go all cloud, and are killing Solaris. Java will probably still continue being
developed. ~~~ vostok > my source is my own experience working with Solaris, which beyond
ZFS and > OracleDB, has very few things going for it At a minimum, I feel like DTrace and
Solaris Containers belong in that list too. ~~~ godmodus I'm perhaps a bit biased towards
FreeBSD/jails, which was more pleasant to work with. But I'll agree, containers and DTrace
are just excellent. ------ Tempest1981 I had almost forgotten about the SPARC CPUs. Looks
like they're approaching 4 GHz, using TSMC's fab. ~~~ cmurf And is open, not proprietary,
and royalty free. ~~~ tossedaway334 While there is an OpenSPARC core (that is around 10
years old), SPARC in general is not "open, not proprietary and royalty free." ~~~ cmurf
[https://en.wikipedia.org/wiki/SPARC](https://en.wikipedia.org/wiki/SPARC) Open: Yes, and
royalty free ------ winteriscoming >> Still Unclear: The extent of near-term terminations.
The fifty percent figure actually seems a bit high to us at present Is this 50% of Oracle,
that's being speculated, or just some specific division within Oracle? 50% of Oracle does
seem very high. ~~~ dlgeek Based solely on the article, it appears to be Oracle's Systems
division, which seems to own SPARC and Solaris - not sure what else without googling.
------ SQL2219
[https://www.indeed.com/jobtrends/q-Solaris.html](https://www.indeed.com/jobtrends/q-Solaris.html)
~~~ Annatar They are all dismally small, including Linux at ~1.4%:
[https://www.indeed.com/jobtrends/q-docker-q-kubernetes-q-sma...](https://www.indeed.com/jobtrends/q-docker-q-kubernetes-q-smartos-q-solaris-q-linux-q-ubuntu-q-delphix.html)
------ SQL2219
[https://www.indeed.com/jobtrends/q-oracle.html](https://www.indeed.com/jobtrends/q-oracle.html)
------ xer0x Is this a joke? I didn't realize Oracle continued working on all these SUN
products.
{ "pile_set_name": "HackerNews" }
John Carmack on Functional Programming (2013) [video] - tosh
https://www.youtube.com/watch?v=1PhArSujR_A ====== JohnCarmack I am aware that my
presentations aren't optimal for communicating targeted information, and it does weigh on
me more and more as the years go by. So far, I haven't been able to justify to myself the
time required to do a really professional job, so I just show up and talk for a few hours.
I like to think there is some value in the spontaneity and unscripted nature, but I don't
kid myself about it being the most effective way to communicate important information. I'm
taking some baby steps -- I at least made a rough outline to guide my talking at last
year's Oculus Connect instead of being in full ramble mode. ~~~ dgritsko While this may be
true, please don't let it cause you to shy away from "full ramble mode" when the
opportunity presents itself! I know I speak for many when I say that I have learned much
from hearing these sorts of talks of yours over the years. Your willingness to share your
wealth of experience is inspirational, regardless of the format. ~~~ bluejellybean Agreed,
full ramble isn't something many people do well and it's fun to watch. Random but useful
information will fall out of people's brains and it's great! ------ cthulhujr I really like
John's perspective on things; he can take a listener or reader from the most abstract
concepts to the nitty-gritty without losing focus. He's a tremendous asset to the
programming world. ~~~ ksk I agree with the second part of your sentence. I have watched
every single one of his talks, and while I find them entertaining, I don't think his
brain-dump style of communication is appropriate for teaching/instruction (to clarify: I
use his talks as a springboard for my own discovery, not as a terminal point to aggregate
knowledge). ~~~ waivek [https://youtu.be/lHLpKzUxjGk](https://youtu.be/lHLpKzUxjGk) I
disagree. See the above link for a lecture where he describes the difficulties in VR in a
manner that anybody with minimal programming experience can understand. ------ pmarreck He
wrote a very good piece here:
[http://www.gamasutra.com/view/news/169296/Indepth_Functional...](http://www.gamasutra.com/view/news/169296/Indepth_Functional_programming_in_C.php)
on the same topic (is it in fact the same?) ~~~ seanalltogether I come back to this quote
over and over when talking about desktop and mobile apps/games, especially since they tend
to be highly state-driven view collections and devs are always trying to come up with DRY
patterns to bury important features in subclasses or helper utils. > A large fraction of
the flaws in software development are due to programmers > not fully understanding all the
possible states their code may execute in. ~~~ pmarreck This is also simultaneously a
strong argument for unit tests (which encode knowledge of that state into a proof of
sorts). The limit of a programmer is his/her brain's ability to contain all these possible
states. Bugs always come from missing some mental modeling of state or having a flawed
conception of it, either at the point of design, the version 1, or the rewrite. OO just
buries that state "elsewhere", so things _seem_ easier superficially to contain mentally
(but the state is still there, ready to get corrupted and pounce on you). FP makes the
state explicit, so you're forced to deal with it at all times (this perhaps not
coincidentally also makes FP easier to unit-test AND reason about).
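To make that concrete, here is a tiny toy sketch of the difference (my own illustration in
C++, not something from Carmack's talk or the article above):

    // Hidden state: the result depends on every call that came before,
    // so a test has to recreate that history first.
    struct Counter {
        int total = 0;
        int addAndGet(int x) { total += x; return total; }  // mutates
    };

    // Explicit state: the same information is passed in and handed back,
    // so the function can be checked with plain input/output pairs,
    // e.g. assert(add(2, 3) == 5), with no setup at all.
    int add(int total, int x) { return total + x; }  // pure

Same behaviour, but in the second version the state is part of the function's signature
instead of being buried inside an object, which is what makes it trivial to unit-test.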
If managing that state upfront becomes unwieldy, then that becomes a good code
smell/indicator that your design is suboptimal. The upshot of all this is that _I think_
that using FP in conjunction with unit-testing reduces bug production by some
statistically-significant amount, especially as a codebase grows. We definitely need more
empirical data about this, though. But that's what my intuition says. ~~~ grogenaut One of
the hard things about game dev and testing is that there are all sorts of bits of code that
don't really have strong "success criteria". A lot of it comes down to "did that feel good"
or "does that look better", which is very hard to quantify in unit tests. However I think
there's also a huge swath of code in games that can be unit tested. Scorekeeping systems,
AI evaluation trees, etc. The industry as a whole has much more of an integration or manual
testing bent than a unit testing one. I will say that coming from a no-tester, all-dev
unit/integration-test world to a no-automated-test, 30-120 high-skill manual tester world
(30 gameplay testers, every other employee using new builds 2x daily), there is definitely
something to having lots of really good manual testers with very short feedback loops.
People noticed bugs that were hard to test for within 10-15 minutes of check-in on a
regular basis. Really if I did a studio I'd have both, but then I'd likely be spending too
much money and go out of business. ~~~ lallysingh Why would unit testing cost extra? It
saves money. ~~~ lgas Like most technical work, it saves money when done well and costs
money when done poorly. Most people find themselves somewhere in the middle most of the
time so it turns out the discussion is a little more nuanced than that. ~~~ grogenaut
Exactly! But it's so much easier to be an unyielding zealot than to think and understand
why and if something is working. As I said above I'm a big advocate of TDD... Do it all the
time. So when I say I wouldn't do it for parts of games, maybe I have a reason (to GP, not
you) ------ jandrese Interesting that he ran into friction exactly where you would expect
him to. He was talking about having the AI scripts running in different purely functional
threads. Later (around 21 minutes in) he mentions what happens when two AIs decide to move
to the same place at the same time, and has not figured out the solution. Of course
parallel programming is easy if you ignore the data dependencies, but eventually you run
into unavoidable dependencies and your clean elegant system has to pick up some smell. ~~~
ece He talked about a whole FP-powered architecture to solve this. ~~~ sololipsist
Motivated reasoning. He prefers imperative programming (or something other than functional
programming), so he hears the "criticism" without hearing the solution. I mean, of course
he physically hears both, but one is prominent in his consciousness. It doesn't matter that
there is a solution; he heard what he needed to hear to get the evidence he needs to
maintain his bias, and shuts off his frontal lobe. To be clear: Everyone does this. I just
think it's interesting. ~~~ scott_s I think you misunderstand Carmack's position. As of
this talk, he had not yet solved this problem, but he is optimistic that it can be solved,
and ece mentioned the solutions he outlines. He goes on to say such a system would be
beneficial to all game developers. ~~~ sololipsist He says it's a problem that can be
tackled, it just hasn't been yet.
Like you look at a square block, and you need to get it in a square hole, you just haven't gotten to it. It's not the language of someone that isn't confident that there isn't a perfectly fine solution. I mean, it's likely that other people have already figured this out, just not him. ------ babuskov It actually starts here: [https://youtu.be/1PhArSujR_A?t=2m7s](https://youtu.be/1PhArSujR_A?t=2m7s) ------ ballpark Should add (2013) and [video] to the title ------ zerr I guess we all had that FP click at some point in our lives, but then we got back to our regular (imperative) flow, hopefully as better engineers :) ~~~ yogthos I got that FP click, switch to Clojure, and pretty happy with that move 7 years later. :) ------ ece I wonder about the present state of the functional projects he talks about and is currently working on. A bit old, but good video talking about FP and static typing from a veteran perspective. Would like to see a conversation between Sid Meier and Carmack, the modern Civ engines seem to have made some strides in stability methinks. ------ vog While he says a lot of interesting things, it is too bad he is really just sitting and talking. No slides, gestures or any other facilities. Those would have brought more structure into the talk, making it easier to follow - especially for non-native speakers. ~~~ epigramx Or more distracting and annoying. ------ zengid The thing that's interesting about Haskell is that lazyness had a payoff of leading to the discovery of monad-based IO [1]. [1] [https://youtu.be/re96UgMk6GQ?t=31m22s](https://youtu.be/re96UgMk6GQ?t=31m22s) ------ bmc7505 Has there been any progress on garbage collection since then? He discusses GC in games starting here: [https://youtu.be/1PhArSujR_A?t=1443](https://youtu.be/1PhArSujR_A?t=1443) ------ keymone this is a best video to explain to your C++ friend why functional programming is worth it even in gamedev world. ~~~ pjmlp C++ has been getting functional programming goodies since C++11. Nowadays with C++17, functional programming in C++ is a common talk subject at C++ conferences. ~~~ keymone that's nice, but generally speaking the mindset is still very much imperative. ~~~ pjmlp Well, it is still a huge fight to get many devs to stop writing "C with C++ compiler" style anyway. The C++ community that cares about C++Now, CppCon, ACCU kind of material, does care about applying functional programming ideas to their daily coding. ~~~ f00_ is it bad that I prefer C to all the shit in C++? Now I don't even write C/C++ regularly, so my opinion is probably shit. But I really like the simplicity of C over all the features of C++. ~~~ pjmlp What is so simple about more than 200 documented cases of UB and all the ways one can write unsafe code without even knowing about it? C++ has stronger type safety, offers the libraries and language features to only go down to C like unsafe coding as last resort. ------ vog EDIT: Why all those downvotes? Are famous developers not supposed to criticized on HN? While John has a lot of interesting things to say, the presentation is awful, almost an imposition to the audience. There's not a single slide, or any repetition to clarify structure, or any notable gestures to make up for that. A simple overview, just a damn simple list of keywords, would already go a long way. That would add a lot of structure and would make the talk so much easier to follow, especially for non-native speakers. 
Just because one is so much respected by the audience that they will tolerate everything,
one should not act like the audience will tolerate everything. ~~~ Kiro I downvoted you
because: 1. You state your criticism like it's an objective fact. I think it was a
brilliant talk. 2. Your edit. It violates the HN guidelines and you insinuate people only
downvote you because they are Carmack fans. ~~~ optimusclimb > You state your criticism
like it's an objective fact, when in reality I'm > sure most would disagree with you. He
stated his opinion; that's how people discuss things. Ironically, by saying "in reality,
I'm sure most would disagree with you", you do the same thing (express your opinion as if
it is fact), AND use the weasel words of "in reality" to add gravitas to your opinion. I
didn't like the talk either FWIW.
{ "pile_set_name": "HackerNews" }
Mutt email client 25 years old - job http://mutt.org ====== schoen My stepmother's cousin,
Jean-Pierre Radley, was a Unix enthusiast and consultant in New York City (mainly an early
SCO adopter, who dabbled in Linux later on). He taught me to use mutt in 1997 or so --
apparently not long after it came out! I'm still using mutt, although Jean-Pierre passed
away three years ago.
[https://www.legacy.com/obituaries/nytimes/obituary.aspx?n=je...](https://www.legacy.com/obituaries/nytimes/obituary.aspx?n=jean-pierre-radley&pid=187551978)
I believe that at the time he started using mutt, he was still getting some of his e-mail
over the declining UUCP network (as he was a tremendous UUCP enthusiast and even provided
commercial UUCP connectivity and support at one point). ------ gorgoiler Shout out to
_davmail_, an IMAP proxy that sits between mutt (or any other IMAP client) and Outlook365.
Davmail handles Outlook's _Modern Authentication_, and launches a browser when a 2FA
challenge/response is required. The latest version of davmail can cache authentication
keys, meaning you only have to go through 2FA once. It's been a real joy to return to mutt,
in my latest job where Outlook is deployed, after years of using Gmail.
[http://davmail.sourceforge.net/](http://davmail.sourceforge.net/) ~~~ nullwarp Thank you
for this - I know what I'm setting up Monday. ------ zadwang I use mutt, mbsync, maildrop,
mblaze, mairix, msmtp, and oauth2 with gmail. I have local speed, two way sync, customized
email filtering, fast searching, vim editing, and multiple machine freedom. I am happy. ~~~
res0nat0r I would absolutely love Mutt if it weren't for everyone in the world using HTML
email. I know you can view or strip HTML emails down to their essentials, but it doesn't
render well enough that I can reply to everything at work and not look like a fool because
I'm missing some context. Too bad because Mutt is amazing. ~~~ feanaro You can just set up
Mutt to open HTML mail in Firefox. ~~~ dancek A great example of the HN use of the word
"just". ~~~ feanaro I'm not sure what you are implying. That it is hard? It is only a
single line added to the `mailcap` file:
text/html; firefox '%s' &; test=test -n "$DISPLAY"; needsterminal;
After that's in place, you simply open attachments (via `view-attachments`) and call
`view-attach` (usually via a key binding) on the HTML attachment. ~~~ bravura You just used
the word "simply" in the same way as you used the word "just". The argument GP was making
is that what is obvious to one reader may not be obvious to another. So using the words
"simply" or "just" or "obviously" does not add information, except to signal that you feel
the reader is ignorant if they are not aware of what you're explaining. My PhD advisor
always crossed these words out of my scientific writing, and I think it was a good change
to make. ~~~ sameerds Strangely, that's entirely the opposite of what I understood from the
original comment. To me, "you can just use XXX" sounds like someone just told me that "all
you need is XXX; don't worry it's simple". The assumption is that the HN'er saying it and
the HN'er being spoken to share an implicit level of expertise since we are all talking
about mutt here. ~~~ feanaro That's exactly how I meant it, FWIW. ------ ninjin Thank you
Mutt (and also NeoMutt for pushing the envelope), you have been my e-mail client now for
the last four years and as an academic that spends a significant amount of time reading and
responding to e-mail it is (somehow?)
the best option out there. Setup: * Mutt (I had slowdowns with NeoMutt) * Vim with four e-mail specific lines in the `vimrc` * fdm for retrieval and delivery rules * Syncthing for synchronisation between machines (it is just files! although ~1,000,000 of them…) * tmux to give me a single horizontal split so that I can browse and compose at the same time * Notmuch for search * Lynx to beat text/html into shape * a tiny snooze shell script of my own coupled with an equally tiny unsnooze shell script that runs every few minutes on a box that is on 24/7 That is it, although I have to admit that I should clean up my `muttrc` at this point as it is an outright mess. There are always more tweaks one could perform, my next one probably being figuring out tagging and then sending multiple mails to the snooze script. But one has to exercise a bit of self restraint or get less rather than more work done due to “over tweaking”. Gripes? Well, the configuration is very arcane at times and monolithic; you wish you had a more modern, scriptable, and modular interface. If so, you could probably cut down the time it takes to get something working by more than half. I also suffer occasional CPU spikes, perhaps due to some weird interaction with Syncthing when both monitor directories. Other than that, it is very smooth and pleasant sailing. HTML e-mails in particular hardly if ever pose a challenge after I got Lynx into the mix. ~~~ ohlookabird > tmux to give me a single horizontal split so that I can browse and compose > at the same time Do you mind sharing how you do this? Browsing plus composing has been the one thing that keeps annoying me for not being streamlined and typically I just open multiple mutts/neomutts. ~~~ ninjin I will not be much help I am afraid: `mutt` for the upper and `mutt -R` for the lower. ~~~ ohlookabird Oh okay, so that's basically what I do manually anyway then. Thanks. ~~~ dm319 Me too, but sometimes no need to make things more complicated. It's also a feature of mutt - that I can fire several instances almost instantaneously and use them for different purposes. I have to use outlook at work, and there's nothing more frustrating than waiting for it to load up, and not being able to view and search my old emails when I am composing emails (or is it when I'm selecting contacts - I can't remember). ------ combatentropy I sometimes daydream that everyone at work uses Linux in general and a text- based email program in particular. It isn't too farfetched. When email took off, I was in college, and we all used Pine. Anyway, if you email me, it will arrive on my virtual private server, and I will read it with Mutt. ~~~ debaserab2 I’ve never been more inclined than now to do the same thing. How do you deal with spam? How do you ensure your emails don’t get junk mailed? ~~~ acidburnNSA Not parent, but I'm in the same boat with the self-hosted email VPS. I usually use Thunderbird as a client though, but sometimes mutt. \- SpamAssassin does a wonderful job after training it about once every 3 years. I get almost no spam. \- I was able to send to most people from my VPS, but not Charter. Charter blanket-blocked my ip block. So I ended up setting up SMTP forwarding via sendgrid free tier (100 emails per day). Now I always get through. ~~~ watchdogtimer I operate my own server and SMTP forward using Mailjet and have had zero problems as well. ------ middleclick I can't imagine doing email without mutt, without all the keyboard shortcuts I have, and without vim as the editor. Thank you mutt. 
Just what an email client should be. I wish there was better search support but that's about it. ~~~ codebook i recently switched to Thunderbird from Neomutt + Notmuch + afew + gmailieer combination. I was satisfied mutt's responsiveness, simplicity. But more and more emails are only for html based and its conversion to text is horrendous, I had to view Html at its own format. Then Thunderbird becomes a good candidate. ~~~ yjftsjthsd-h Similarly to sibling comment, I just set in mailcap: text/html; elinks -dump %s; nametemplate=%s.html; copiousoutput ~~~ codetrotter The really annoying thing is that some mail is multipart where the text/plain is just “sorry, your email client does not support html”. Yeah thanks a lot but it does, it’s just I have it set to prefer text/plain, because I don’t want to look at the html dump version unless I have to. If those people had simply sent _only_ text/html and not a useless text/plain that only says that kind of stuff then everything would have worked fine. Stuff like that makes me want to quit using mutt, and no fault of mutt mind you. But laziness has kept me to using mutt for reading my self hosted mail for many years now. ~~~ jjav That's mildly annoying indeed, but I just press v and select the html version in those cases. As you say, not mutt's fault. ~~~ dm319 Yes, and this is still usually quicker than firing up a full-fat email client (as I often have a web browser open anyway). ------ samcheng I remember the mind-blowing switch from pine to mutt. Then, a few years later, gmail, which has to be about 20 years old or so. I haven't used much mutt since. ~~~ jjav gmail is such a huge regression from mutt (or even pine or elm). I rank the email experience of gmail on par with /bin/mail. It's that useless. I use mutt as an IMAP client for gmail when I can, that works mostly ok. At current $DAYJOB they only allow gmail web client which is a complete disaster of unusability. So I mostly just zoom or slack people since gmail is so useless. gmail is email written by those who have never used a proper email client and don't really care about email anyway. (gmail came out in 2004 - got the tshirt from the launch.) ~~~ dm319 As the search experts, I'm always surprised how poor and unreliable search in gmail is. I also use mutt with my gmail account via mbsync. At least you get to use gmail webmail. I have to use Outlook at work. ~~~ rawoke083600 Lol side-story... I once worked for Big-Multi-National i.t/startup company they even have a stake in Tencent. Their official mail policy was after 3 months we delete all your email.(You had to save/archive) it yourself if you wanted to have access to it. Some MSExchange setup. The 'reason' back then was 'Some semi-high-level-executive lost his tablet somewhere with important mail. I couldn't believe a company this big and enough money to buy common sense and intelligence came up with this 3-month rule as a solution to 'lost executive tablet'. They closed allot of their local I.T/Startups in the last decade in South Africa. ~~~ dm319 Jeez, what an awful solution. I work in an academic environment and also in a healthcare environment. There is a national 'secure' MSexchange service for patient data, and we also use a local exchange server for within-hospital communication. I have no idea how secure any/all of this is, but I tried to add my 'national' account onto my regular outlook, and it broke everything. IT support told me that I wasn't allowed to do this because it wouldn't be secure. 
Which confused me given I used Outlook for confidential communication within the healthcare provider. So I'm forced to use their webmail client which is even worse than Outlook. Not sure how email got into this mess. It should be simpler, and confidential communication should just use something different IMO. ------ dang If curious see also 2017 [https://news.ycombinator.com/item?id=14567074](https://news.ycombinator.com/item?id=14567074) 2015 [https://news.ycombinator.com/item?id=10182582](https://news.ycombinator.com/item?id=10182582) ------ kashyapc Another extremely happy user of the following combination waves hi: • Mutt • OfflineIMAP (to be replaced with `mbsync`; OfflineIMAP is not being ported to Python 3, unfortunately) • Notmuch — for fast indexing, searching, and tagging e-mails • Postfix — Mail Transfer Agent; offline queuing. Overall, it's one of the most robust pieces of software; Postfix never failed even _once_ on me in nearly six years. ~~~ dm319 Same here, though I moved to mbsync a few years ago and I need to check out postfix. It is such a great email client that I feel lost and walking through treacle using anything else. I don't work in the tech field, but I have a lot of emails to deal with. People come to me now to ask me if I can find specific important emails from a while ago that they are after. Outlook is hilariously bad at searching. I wish I could use mutt instead of outlook for my work. EDIT: It looks like I'm using msmtp instead of postfix - any advantages to switching over to postfix? ~~~ kashyapc If 'msmtp' is working for your use case, just don't fix it. :-) (Although, someone here might be able give a comparative rundown.) I just deal with a _lot_ of email volume - large portion of it public mailing lists; Postfix has been a trusted comrade. ------ stephenhuey Noooooo WAY. I started at Rice in '98 and remember using Pine on the Sun Solaris desktops spread all over campus. Of course I imagined Pine must be ancient software, and when someone told me to switch to Mutt halfway through college, I assumed it was just yet another super ancient program, not realizing it had been released when I was in high school! ~~~ pessimizer This has definitely been added to my list of things that I thought were far older than they were. ------ mbreese It's probably time for my annual attempt to convert over to using a mutt-based email workflow. I've always been happy with the mutt setup, but I think the majority of my problem is dealing with importing my existing Gmail and (work) Office 365 accounts. Sometimes I've synced the data to my laptop (last time I even had it all running in a container). But I still need to access enough of my email from my phone that I've found the mutt setup too cumbersome, even when using IMAP. Does anyone have a good setup with multiple accounts or using both mutt with a mobile email client (iOS)? ~~~ RamenDevourer mutt-wizard makes it very easy to setup mutt/neomutt to make it work with gmail and other services. It sets up encrypted offline versions of your mail and sets up most of the mutt environment for you: [https://github.com/LukeSmithxyz/mutt- wizard](https://github.com/LukeSmithxyz/mutt-wizard) ------ IgorPartola One of the major reasons I use Gmail in the browser is because it’s in the browser: it’s always on the same tab and I can switch to it when I want to check it. I used Mutt (unsuccessfully) a few years ago since my other window that’s always open is a terminal emulator. 
I wonder: should I fire up a Linux emulation in my browser and run Mutt inside that? I’m sure things have gotten fast enough that this would actually work. ~~~ awesome_dude I use mutt for Gmail in my terminal, the setup is fairly easy and documented across blog posts example muttrc ========================================================= set imap_user = '<username>@gmail.com' set imap_authenticators="oauthbearer" set imap_oauth_refresh_command="~/.mutt/oauth2.py --quiet --user=<username>@gmail.com --client_id=<id from google>.apps.googleusercontent.com --client_secret=<secret from google> \--refresh_token=<token from, you guessed it, google>" set smtp_authenticators="oauthbearer" set smtp_oauth_refresh_command="~/.mutt/oauth2.py --quiet --user=<username>@gmail.com --client_id=<id from google>.apps.googleusercontent.com --client_secret=<secret from google> \--refresh_token=<token from google>" set ssl_starttls=yes set ssl_force_tls=yes set imap_pass = 'my really hard to guess gmail password' set from='<username>@gmail.com' set realname='Fancy pants name' set folder = imaps://imap.gmail.com/ set spoolfile = +INBOX set record = "+[Gmail]/Sent Mail" set postponed = "+[Gmail]/Drafts" set trash = "+[Gmail]/Trash" set postponed="imaps://imap.gmail.com/[Gmail]/Drafts" set header_cache = "~/.mutt/cache/headers" set message_cachedir = "~/.mutt/cache/bodies" set certificate_file = "~/.mutt/certificates" set smtp_url = 'smtp://<username>@gmail.com:<my really hard to guess password>@smtp.gmail.com:587/' set smtp_pass = 'my really hard to guess password' set move = no set imap_keepalive = 900 #refresh every 10 seconds set timeout=10 set mail_check=20 # allow mutt to open new imap connection automatically unset imap_passive # vim! set editor=vim # email sorting set sort=threads set sort_browser=reverse-date set sort_aux=last-date-received # handy macros macro index gd "<change-folder>$postponed<enter>" "go to drafts" macro index gs "<change-folder>$record<enter>" "go to sent" macro index gi "<change-folder>$spoolfile<Enter>" "go to inbox" macro index gt "<change-folder>$trash<enter>" "go to trash""" macro index tb s+[Gmail]/Trash set noconfirmappend set quit=ask-yes ========================================================= ~~~ tambourine_man I still use POP cause I’m paranoid. Does imap guarantees that it downloads and stores all mail indefinitely? ~~~ peterwwillis Sure, but the problem of storing messages indefinitely isn't from the protocol, it's from the client. The client may need to verify if it's actually seen a message before and keep a local index of them; it may need to double check the message still is what it expects before it tries to delete it; it needs to present you with options on how to handle message deletion; etc. POP3 is actually _less_ reliable at keeping messages indefinitely than IMAPv4 is, due to limitations of the protocol. Some clients have defaults which simply delete the remote copy after download (or reading, which is different), but again that's not the protocol's fault. ------ steffan For console email, I still use Alpine. Can anyone elaborate on the benefits (if any) of switching to Mutt? ~~~ dsr_ mutt is the best MUA for dealing with large quantities of email. The key to this is the limit/tag workflow. 
(I wrote it up a few years ago here: [https://blog.randomstring.org/2016/09/26/secrets-of- mutt/](https://blog.randomstring.org/2016/09/26/secrets-of-mutt/) ) In brief, you use the limit command to see only the email you are interested in, then use the tag command to do something to those specific messages. Both limit and tag use the same pattern language to specify From, Subject, date ranges, message numbers, or more esoteric searches. The only thing mutt is not great at is searching multiple mailboxes, and it's easy to integrate an external tool (notmuch, maildir-utils, mairix...) to do that search and link the results into a temporary maildir that mutt will then work on. ~~~ aidenn0 I find bower to be better than mutt at dealing with large quantities of email, but mutt is better at everything else. bower can also run locally with a mailbox on a remote, which allows things like easy seamless opening of attachments that don't work as well in a terminal (images, PDFs, &c.). ~~~ dsr_ Mutt works with 250,000 message maildirs. Does bower? Conveniently? ~~~ aidenn0 My largest maildir only has 80k, but bower works with it quite conveniently. ------ spicymaki Amazing, I feel like I am getting old. I remember using Pine in college and changing over to Mutt. ------ mattbillenstein I still use mutt (with [https://mailinabox.email/](https://mailinabox.email/)) -- and I like how it works, but the annoying thing is html-only emails - finding that unsubscribe link is so hard sometimes. ~~~ wander_homer In such cases I just open the html file in my browser from mutt. Always worked so far. ~~~ codebook It worked to me. But honestly annoying. I had to keep opening the browser for half of the emails. ------ fouc I believe Paul Graham was a mutt user at the time of his spam article[0], and possibly still is. [0] [http://www.paulgraham.com/spam.html](http://www.paulgraham.com/spam.html) ------ toadi I actually stopped using email for private communication. Most emails are just invoices, confirmations or license keys. Gave up on email few years ago. Professional I use email for mostly the same and for official trace communication.... ------ MayaFey I so happened to have spent yesterday afternoon figuring out mu4e on Emacs, which I'm new to. Figured out IMAP but not sending yet. Worked with gmail with a very simple config file. I've tried doing my own mail but it's very hard. ------ upofadown Mutt is the best PGP client bar none. Just make sure you use the GPGME version. ------ jjav Thanks mutt! I've been using it for most of those 25 years. Thank you. ------ em-bee the best times with mutt were the days where we were hanging out with michael elkins (me) at the linux user group meetings in L.A. during which i came up with this joke: "i didn't write mutt, it was me" i used mutt until a few years ago when i switched to sup. ------ ggm I solved a problem with nmh last week. That felt good (finding ten emails by string search from a .mbox format archive and making another mbox archive from them) I am pretty sure mutt would have done it too, I used mh because I had the muscle memory ------ xvilka Would be nice to modernize it by porting to Rust, thus reusing some of the libraries for network protocols, cryptography, file formats. It will allow developers to focus on the important things. ~~~ dsr_ It's open source, so you are totally welcome to do that thing. Enjoy yourself. 
Please call the resulting program something related to but not identical to mutt, such as rustmutt, rustydog, mutt-ferric-oxide, or whatever else makes you happy.
{ "pile_set_name": "HackerNews" }
Ask HN: Local State School CS major vs. Math/Econ Name Brand Uni? - osiro I'm a rising
college sophomore who applied for transfer admission to some schools and got off the
waitlist for a Name Brand University (think Michigan, UVa, Rice). My current uni has a
solid CS dept (top 25 based on grad rankings), but the schools I'm looking to transfer to
have a better overall rep, and more recruiters come to these schools. I applied to the arts
and sciences colleges for transfer, and at the moment I'm a Math/Econ major. I can try to
switch into CS, but I'll probably end up with a B.A. instead of a B.S. (same classes
though). Worst case scenario, I'd major in Math/Econ with a minor in CS; would I still be a
strong candidate? I know C/C++, Python, Java, R, and JavaScript. Also familiar with some
open source projects, and have some side projects (posted on GitHub and all that). Most of
the stuff I know is self-taught, so I'm not worried about gaining skills. Will employers
care if I have a Math/Econ degree instead of a CS one? ====== computerjunkie "Will
employers care if I have a Math/Econ degree instead of a CS one?" <<< Unless you are
pursuing a specific area in computer science, it's not necessary to do computer science.
These days, certain employers don't look for specific degrees... But if you have a
mathematically intense degree, employers will be more flexible in selecting you for
assessment/interview. A computer science degree is more about understanding computers on a
theoretical computational level. Computer science is a "subset" of mathematics. I just
finished my second year in CS and from what I've learned, it's all about self-initiation
and how far you want to "dig deep" in this field. Either way you will gain useful knowledge
that can be applied in whatever you want to do.
{ "pile_set_name": "HackerNews" }
A Spacecraft for All: The Journey of the ISEE-3 - tonteldoos http://www.spacecraftforall.com
====== callesgg It is very annoying to see more and more pages that say to use Chrome to
look at the page. However, this page seems to work fine in Firefox anyway. And it looked
very cool :) ~~~ tonteldoos Reminds me of yesterday's chrome vs firefox debate ;) ~~~ 3rd3
Link? ~~~ tonteldoos
[https://news.ycombinator.com/item?id=8151180](https://news.ycombinator.com/item?id=8151180)
Be warned - it's a lengthy debate ;) ------ aniijbod Ok, I get it, we're all looking at
page design and visualisation issues, which is the kind of thing we do here. But isn't
anyone going to just marvel and express wonder and astonishment at the sheer majesty of the
extraordinary achievements described in the videos? ------ ajford This was definitely a fun
project to be a part of (the reboot, not the chrome experiment). We out here at the Arecibo
Observatory are proud to be a part of this moment in history. It was a joy to work with
Dennis Wingo and his team; they are definitely cool guys. ------ pronoiac A live webcast is
running - [http://spacecraftforall.com/live](http://spacecraftforall.com/live) Edit: Here's
the Hangout link:
[https://www.youtube.com/watch?v=SdtUIXPjVgk](https://www.youtube.com/watch?v=SdtUIXPjVgk)
------ 3rd3 I find it frustrating that one can only rotate the view within a limited range.
Otherwise great work! ------ manish_gill Doesn't work on latest Chrome at all, but works
beautifully in my Firefox Nightly. ------ snowwrestler For others wondering how to escape
this page on an iPhone, rotate to a vertical orientation to show the address bar, then tap
the address bar to bring up the Safari controls. ------ kgabis This page is barely usable
and almost fried my Mac. Kinda reminds me of websites built with Flash. Is there any other
way to watch that video about ISEE-3? ------ nitrogen Though it says "Chrome Experiment",
it seems to work mostly okay in Firefox, but occasionally some things seem to be missing
from the view. ------ FrankenPC What an incredibly well done presentation. ------
pepijndevos Why are there loops in the heliocentric orbit? ------ kp25 Great Stuff!!
{ "pile_set_name": "HackerNews" }
Bitcoin exchange halts trades of digital currency after drop in value - xd http://www.guardian.co.uk/technology/2013/apr/11/bitcoin-exchange-halts-trade-value ====== unclebucknasty Seems like a ton of new accounts would represent increased demand, sending the price up, not down. What am I missing?
{ "pile_set_name": "HackerNews" }
Iran in space - user9756 http://therealamirtaheri.blogspot.co.uk/2012/10/keep-your-eyes-in-sky-iran-in-space.html ====== nasir It is interesting for me when reading the opinion of others about Iran. As a person who had lived inside and outside Iran long enough, I feel very sceptical about the content of the article. The author's statements are very similar to those of the government authorities where only certain facts are described regardless of the issues. I'm not an expert about space industry but since I am pretty sure that the article is exaggerating heavily about the advancements and they heavily rely on China and Russia. Also the author claims " _the sanctions have, like many other local industries, pushed Iran to meet its needs locally and therefore advance quicker than possible if Iran had the easy option of importing everything it needed_ ". Apparently he tries to portray that nothing had happened because of the sanctions but it is apparent that the sanctions has crippled the economy currently and the currency value has reduced by five time. The prices has tripled since 4 month ago and middle class society are getting poorer just because of the nuclear ambitions of the leaders. I should also state that I don't believe Iran is making nuclear weapon nor can I deny it. I think western countries would've had the same policy even if no nuclear activity was there and some other excuse will be used in order to protect Israel. Western media have prepared people so well in case they need to take any action against Iran. Well, I hope that does not happen. ~~~ guard-of-terra "The article is exaggerating heavily about the advancements and they heavily rely on China and Russia." I remember how ten years ago you could read the same phrase about Chinese program, and now they are on the providing side already! ~~~ nasir I don't deny they are making progress. But thats not as what you think it is! ------ cstross Author of blog piece seems to be _very_ confused about how staged rockets work, and appears to think that a two stage rocket is somehow inherently more efficient than a three stage rocket! Which is just plain wrong. <http://en.wikipedia.org/wiki/Multistage_rocket> What he might be getting is that, if Iran had stuck a third stage on top of a pre-existing two stage IRBM in order to put a tiny wee satellite into orbit, then it would imply that the rocket in question was borderline-capable and not really amenable to being upgraded further. But the Safir-2 is apparently a 2-stage rocket with the ability to reach orbital velocity, implying that by adding boosters or a third stage to it a much larger payload could be launched. ~~~ mistercow I was thinking that might be what he meant, but then he says this: >While a three-stage rocket is simple and is limited on how much of a weight threshold it can lift, the more advanced two-stage can be expanded for larger weights. The first part of that is, I'm pretty sure, the opposite of true. The second part is sort of true, but the "expansion" in question would be... adding a third stage. I think he heard the Geoffrey Forden quote and misunderstood the point, which is admittedly subtle. I think what Forden meant was: "If they need three stages to do just this, then they'll never be able to use this to launch a human, because a four-stage rocket is completely beyond them. But if they can do _this_ with only two stages, they could get a human to space with three stages." ~~~ cstross Yup, that was my reading, too. 
------ daeken I'm so glad to see Iran doing this. Not because I have any particular affinity for Iran, but because the technology will improve as competition increases and more nations/companies take steps into space. This may be a big step for Iran, but it's an even bigger one for the human race as a whole. ~~~ tjmc Are you glad to see North Korea develop its "space program" too? Yes, I know this is HN where discussions about politics are discouraged, but in this case it's bordering on farcical to ignore it. I have a lecturer in fluid mechanics who used to be an engineer on the Indian space program. He interchangeably uses the terms "satellite" and "warhead" when discussing the work he used to do. It may well be that Iran wants to develop orbital launch capability for launching satellites and people into space. But assuming that's the only reason is a little naive. ~~~ eternauta3k I like the occasional reminder that Iran has never started a war. And it won't unless it plans to be obliterated, so don't worry about that. ~~~ redwood Iran is in a war with its own citizens and its own women especially. ~~~ user9756 Not true. Stop lying. Are you eg familiar with the actions taken by the Iranian government which made it possible for people in rural areas to send their daughters to school? Being extremely traditional they were afraid of sending their girls all alone to school. Iran has a very high number of educated women thanks to the policies of the Iranian government. (edit: yes women still have advances to make in rights vis-a-vis men in Iranian society, but you can hardly call that "war" more than a natural struggle for power between the sexes in a society) ~~~ EliRivers "natural struggle for power between the sexes" There's nothing natural about such a struggle. It's cultural. ~~~ user9756 Yeah, I thought about that use of the word ("natural") after I posted, but decided not to edit it out. I agree with you that it could be cultural. But the behaviour of men wanting to control women seems to be prevalent in many societies regardless of culture. ~~~ EliRivers I think it's possible that it's actually just some people wanting to control other people (and setting their sights on weaker people to improve chances of success), and given the biological differences between men and women that leave women with significantly less muscle mass and mass in general, for any given male choosing to act in this way, women are on average easier targets. ------ benjlang Great to hear. We in Israel are also planning on sending a person to the moon soon: <http://www.spaceil.com> Innovation in the Middle East is much needed. ~~~ dchichkov I'm surprised, that you are glad to hear, that a country with relatively unstable government is making progress toward having ballistic missile capabilities. Actually scratch that. It is progress toward ICBMs, not just ballistic missiles. Here's a reference to the current state of the program: [http://iranprimer.usip.org/resource/irans-ballistic- missile-...](http://iranprimer.usip.org/resource/irans-ballistic-missile- program) And a bit more recent update: <http://www.iranwatch.org/wmd/wmd- iranmissileessay.htm> ~~~ benjlang You're right, that is worrisome but I highly doubt it will reach that point. The US and Israel will never allow that to happen. If only the Iranian government cared about its people and making the world a better and more innovative place... 
~~~ dchichkov Well, one fact to keep in mind - developing capabilities to launch satellites
[payload to orbit] is inseparable from the capability to launch ICBMs. That's why the first
Sputnik was both so inspiring and so scary. As to 'never allowing this to happen': there
are no such guarantees. I'd guess the opposite is true. Technology [and as a result
advanced weaponry] is only getting more and more accessible. ------ dsl Regardless of your
views on the sanctions themselves, this blog post illustrates how effective the US blockade
of technology to Iran really is. The Iranian space program is just starting to reach the
level of sophistication of hobbyists in the western world. The post, however, fails to
mention Iran has actually launched a total of 5 satellites, the first two being joint
launches with Russia and China. ~~~ guard-of-terra You are plain wrong. Any space program
takes years and starts with satellite launches. Look at the Chinese program, for example.
There is no reason to think the US blockade somehow harmed the progress of the program
there. And it does not matter much since the USSR was known to launch anything they wanted
before the modern globalized economics that made blockades possible even existed. Your
comment about hobbyists is plain misleading too. How many hobbyists launch pets into
suborbital and orbital flights and get those back in one piece? How many hobbyists can
launch a satellite, anyway? Using their own rocket? You can pimp SpaceX to defend your
misleading statement, but they are not in any sense hobbyists and their work would not be
possible without NASA, and it took them long years anyway. So please stop disregarding
other people's achievements. This is just lame. ~~~ tomjen3 No, but Copenhagen Suborbitals
are hobbyists. And they expect to launch a suborbital rocket with a human onboard. And
their combined budget is probably less than what NASA uses on toilets. ~~~ ryanac Some of
those "hobbyists" at Copenhagen Suborbitals are former NASA contractors and they have been
working on that project since May 2008, so I think his main point still stands. It also
looks like they're still doing a lot of testing on active guidance systems, rather than
trying to put a human up any time soon. You can see their plans for the rest of this year
under Campaigns > 2012 on their main site:
[http://www.copenhagensuborbitals.com](http://www.copenhagensuborbitals.com) ------
amirex111 The reason that Iran has no nukes is by their own choice. They have always wanted
the Japan model, to reach the technological threshold of having nukes without physically
building them. This they reached years ago. ------ batgaijin If Iran is capable of
technological progress at this rate, how did Pakistan get a nuke before Iran? ~~~ sagarun
The Chinese gave them one! ------ amirex111 Thanks user9756 for posting this!
{ "pile_set_name": "HackerNews" }
A National Privacy Law Is Nowhere in Sight - tysone https://www.nytimes.com/2019/10/01/technology/national-privacy-law.html ====== client4 In Montana I've been working on privacy legistlation with my friend and State Representative Daniel Zolnikov for almost 10 years now. Initially we tried to pass too much (something like 40 pages in a state where most bills are 1-2), then Dan decided to break out individual ideas from the initial bill and pass smaller chunks or create specific bills targeting abuses. The Montana state constitution provides citizens with a right to privacy but attempting to enumerate it in law is an uphill battle. At the last legislative session the national Charter public policy guy flew in and spoke against the bill saying it would be an incredible burden on his company to allow customers the right to privacy (conveniently ignoring they're up to the task for collecting and storing DNS, usage history, a multi-billion dollar company etc.). They were able to spread enough FUD to get the bill killed. Corporations are stifling privacy at the national level and willing to spend the money quashing attempts by the states to take it back. Privacy will be a hard-fought battle most likely won with technology and user choices vs policy. ~~~ toomuchtodo Do you have any docs or knowledge you could share? I'd be interested in working on this in several midwest states. ~~~ client4 Dan is a great resource to reach out to, he's on Keybase Chat [https://keybase.io/drz](https://keybase.io/drz) and more than happy to help people in the policy realm. Our naive first bill is well organized with some of the things we felt necessary to enable an individuals right to privacy and can be found here [https://leg.mt.gov/bills/2013/billhtml/HB0400.htm](https://leg.mt.gov/bills/2013/billhtml/HB0400.htm) ------ comex Good. With the current partisan makeup of Congress, even if they could agree on a national privacy law, it would be drastically watered down and toothless compared to California’s law, while preempting it (i.e. preventing California and other states from legislating in that area). There’s a reason that it’s industry groups, not consumer groups, who are currently pushing for such a law. Without one, at least California residents will get the benefits of the stricter version, and perhaps other blue states will pass similar laws in coming years. Meanwhile, residents of other states will at least get indirect benefits – for instance, the ability to bring class action lawsuits over data breaches will hopefully encourage companies to invest more in security. ------ brenden2 Unfortunately, I don't think the government is going to come to the rescue anytime soon with regard to privacy. At this point I think the best option is for people passionate about privacy issues to start building products and platforms that address the problems by providing real solutions. That's what I'm doing with all my energy, and I know a lot of other talented people who are too (or, at the very least, they want to help). ~~~ Reedx Maybe if Andrew Yang makes it through we'll see something happen, since his overall approach has a chance of reducing gridlock. And on this issue is proposing data as a property right: [https://www.yang2020.com/policies/data-property- right/](https://www.yang2020.com/policies/data-property-right/) ~~~ danShumway For members of the privacy community that oppose efforts like this, we probably should probably be more careful with the phrasing around "your" data. 
What a lot of us meant was, "data that is generated by actions you take, or that applies to your life." What (increasingly), people interpret it as meaning is "your data" in the same way that you would say, "your boots." The EU often falls back to referring to this stuff as Personally Identifying Information, which is at least a more logically coherent term -- it focuses more on de-anonymization and the risk to the individual, rather than on this idea that I somehow have a morally granted monopoly over a fact. I'm trying to be more careful about how I phrase stuff like this in the future (PII, or 'facts about you' instead of 'your data'), to try and make it more obvious that the privacy community isn't monolithic and that there is no singular view even among privacy advocates about what data is or how it can be controlled. ~~~ inetknght > _For members of the privacy community that oppose efforts like this, we > probably should probably be more careful with the phrasing around "your" > data. What a lot of us meant was, "data that is generated by actions you > take, or that applies to your life." What (increasingly), people interpret > it as meaning is "your data" in the same way that you would say, "your > boots."_ When I say "your data" I mean exactly "your data" in the same way that I would say "your home" is _your_ home but also the home of people you live with. ~~~ danShumway I'm guessing though that you don't belong to the subset of people who oppose efforts like this? It sounds like you're saying that you do think "your data" describes a type of property ownership, not just an abstract thing you have a stake in protecting. ------ bryanmgreen I believe one of the biggest blockers for ~right now~ is Ajit Pai, FCC Chairman. Anything around privacy will need the support of the FCC and, under the current administration, they have clearly set a standard of favoring a hands- off approach. In fact, avoiding regulation is the number one philosophy bullet point on his profile - [https://www.fcc.gov/about/leadership/ajit- pai](https://www.fcc.gov/about/leadership/ajit-pai). While I am generally and very-broadly-speaking against regulation, I believe privacy and personal-data issues are definitely an exception for me. Edit: Really not trying to be partisan here, folks. Privacy laws are regulatory and it's not an opinion that the FCC doesn't want regulations - that's why I provided the link to the .Gov website for proof. ~~~ Accujack To be fair, this isn't solely a problem from the current administration. Privacy laws in the US have not ever been updated for the computer age. It's still safer to send secret info in a paper letter than in an e-mail, because the laws have never been re-written. This is not surprising given the generation that's been in power since the 1970s. They didn't grow up with computers, and they don't understand them. ~~~ bryanmgreen You're 100% right that this isn't a problem that began or ends with the current administration. There's a lot of work to be done from every party and every age group! I was reflecting on what I what think is a ~current~ large obstacle and tried my best to keep my comment neutral, but thanks for clarifying. ------ dr_dshiv Also, what about a right to cognitive liberty? Privacy is one thing, but does it give me the right to take drugs or implant, because I have a right to my own mind? ~~~ LocalH Sad to see this downvoted, probably due to the mention of drugs tbh. I feel cognitive liberty is underdiscussed. 
Not just in the context of consumption, but also in the context of mental autonomy. One of these days, if we don't have a strong cognitive liberty protection, they'll end up developing technology to "read" people's minds (which of course will be unreliable at first), and if we're not careful then the idea of having privacy _in our own minds_ will be a thing of the past. If that happens, I fear we'll slip into a dystopia within a quarter century. ~~~ dr_dshiv If that's so clear, how come no one seems to care about cognitive liberty? ~~~ LocalH Probably because they're worried (somewhat legitimately) about more currently pressing issues. I have no hard evidence to back up my assertion. It's just my feeling after observing the path our societies are traveling. I _really_ hope I'm dead wrong on that. ------ journalctl Well, yeah. Functional legislation requires a functional government. This hasn’t been news for a long time. ------ bedhead That would require congress to actually do something. ------ noarchy The US couldn't even stop its own intelligence services from breaking the law with regard to spying on citizens. Do people expect it can or will enforce this on private companies? I'd expect such measures to be equally toothless.
{ "pile_set_name": "HackerNews" }
Internet-induced fear culture (or: Why Girls Around Me isn’t the problem) - evo_9 http://www.extremetech.com/extreme/124531-internet-induced-fear-culture-or-why-girls-around-me-isnt-the-problem
====== jerf
Rape is the headline-grabbing problem, but the real problem is a far more mundane loss of privacy. Do you really want _everybody_ in all your social circles to know exactly where you're partying Friday nights, and exactly whose house you actually ended up sleeping at? Or if you have no problem with that, insert your choice of thing you don't really care to broadcast to everybody. A nontrivial part of the reason why I have no interest in being on facebook is the amount of my family that is on it. And I'm not even doing or saying anything that my family will particularly find outrageous like some people; I wouldn't be leaking a sexual preference or something I don't want to reveal to them at this time. I simply don't care to run my every opinion, location, preference, and activity past them, and then have to hear about their opinions about it. To be clear, I'm 33 and have long since stopped _caring_ about their opinions... but I still don't want to listen to them, either. I'm very closely related to some people in my family whom I can't really get away from who are virtually incapable of having a thought without saying it out loud. YMMV. And I am by far _not_ a pathological example... I'm doing this more out of my personal convenience than anything else; there's nothing that I feel I _have_ to hide. What of those that do? And if you're really having a hard time imagining what that sort of thing may be... this very post is an example. The person I'm referring to will _never find it here_. Ever. I can speak freely here, at least as long as I don't name names, which I wouldn't anyhow. I'm not in a hurry to live in a world where that goes away. Online balkanization isn't all bad. It isn't even mostly bad.
~~~ bradleyland
So what is your proposed solution? I already don't post anything online that I don't consider public, just like I don't discuss in a restaurant something I wouldn't want overheard. I have an entirely different privacy threshold than you do. Would any proposed solution respect the fact that I might choose to share more (or less)? For me, the solution seems pretty straightforward. If you desire privacy, don't share.
~~~ jerf
"So what is your proposed solution?" My _real_ proposed solution? Wait for Facebook to collapse under its own weight like all previous attempts to massively centralize the web under one roof have failed, then wait for the more decentralized solutions to pop up and potentially join many of them for my various personae, each of which possibly with its own thresholds for sharing and different communities in them. The human instinct that the answer to every problem is centralization is a leftover vestige of our tribal past that is particularly maladaptive in the 21st century, though I'll admit to being impressed by how far along that road Facebook has gotten. It's only a matter of time; the contradictions between Facebook's desires and the user's desires are only going to grow.
------ dfxm12
The author claims the Internet is the biggest enemy in this "fear-induced culture", due to its ease of spreading FUD. People have been dealing with sensational headlines for a good while now. However, it may or may not be apparent when you buy your first smartphone that it is broadcasting your location along with all of your social networking posts.
The _real_ enemies here are insecure defaults & lack of education around how & why to control personal information online.
~~~ bradleyland
> The real enemies here are insecure defaults & lack of education around how & why to control personal information online.
The real enemy here is the rapist. Among the measures that a person can take to avoid being the victim of a crime, _not_ checking in on social media sites is pretty low. The fact is that crime is generally very low tech. Cyber-rapists are exceedingly rare. If you're going to opt out because you fear this type of threat, you might as well not leave your underground bunker.
~~~ throwaway64
You are entirely missing the point when you start focusing on "cyber rapists" or girls, or anything like that. The point is that ANYONE can potentially pull and track this information and combine it with other data sources without your permission or knowledge. Months, years, or even decades from now, with increasingly better analysis techniques. That could be a rapist, that could be an abusive government, that could be your ex boyfriend/girlfriend, that could be your employers checking if you were at the bar last night, it could even be your health insurance making sure you went to the gym yesterday, or else they raise your premium. It gets even scarier when you have companies like, say, credit agencies tracking where you went and who you were closest to, and determining your "risk score" for various things like loans and jobs, etc. The point is there is a massive potential for abuse here that most people have absolutely no idea about. It's simply startling to me that so many here work day in, day out on computers, yet totally fail to realize the power of data analytics when you feed in massive amounts of data. The most disturbing point of all in this shit is that when you are refused a job, or a loan, or medical insurance, you won't ever know why; you will simply know you made the numbers go down for some reason, and people will become deathly afraid of that, so will take fewer risks in their life, and generally lead a very closely guarded, boring existence.
~~~ Spearchucker
What is the probability of that happening to any individual? And what's the impact? At a guess I'd say the impact _can_ be high. The probability is probably quite low. If exposure = probability x impact, then exposure is (guessing) low to medium.
~~~ bradleyland
You'd be amazed at what's _really_ present in the statistics. Humans commonly worry about the wrong things. Categorically, we are very poor judges of risk on a broad scale. In a single event, we're pretty good, but it cannot be emphasized enough that we suck hard at intuitive understanding of broad threats. Here's an article that points out quite a few things you wouldn't expect. It's about 2 years old, which isn't all that old for stats. These things take time to compile. [http://crimeinamerica.net/2010/12/13/what-are-my-chances-of-...](http://crimeinamerica.net/2010/12/13/what-are-my-chances-of-being-a-victim-of-violent-crime/) A relevant point: "Females knew their offenders in almost 70% of violent crimes committed against them (they are relatives, friends or acquaintances). If females make the best possible choices as to who they associate with (if they have a choice) their rates of violent crime drop considerably." So right off the bat, we can see that a significant majority of violent crime is committed by the person you know, not some random stalker on the internet.
My point isn't that we should ignore internet privacy concerns. My point is that the fear mongering in the press is reprehensible. The obligation of the press is to inform the public. Today, it seems that the obligation of the press is to sell ad impressions, and scary plot lines... sorry, scary _editorials_ about internet rapists draw readers.
------ silentscope
"These reports don’t for one minute think that this is just a fun app — an app that most people will run once, laugh heartily (or a little nervously), and then never look at it again."
For most people the sentence above is true. But I don't think anyone's surprised to hear that there are dangerous people in the world. Yep, they exist and they can hurt people--they HAVE hurt people. Closing your eyes to something doesn't make it disappear, just harder to see. We're not worried about "cyber rapists" because there is no such thing as a "cyber rapist". Rape happens in the real world. This app won't change nightlife into a horrible dangerous place--but it doesn't make the world a better place either. This is the dark side of technology because it makes the world cheaper, less incredible. It's alienating and scary and makes people more cautious. It does society a disservice--the only people it does a service for are the people whose motives are questionable. I'm not saying you don't have a point. You do. We can discuss this without being so up-in-arms about it. But one extreme is just as bad as the other. When the counterpoint bar is set at 'We'll look at it then just delete it,' I think we can agree that this app is a problem. The question is how bad is it?
~~~ DanBC
Many people were being raped before this app appeared. Many people will be raped after this app, but none of those will have any connection to this app. Most women who are raped are raped by someone who knows them. There are serious problems with privacy settings of various websites, and this app is a nifty creepy way to highlight those problems. But suggesting it is "rapetastic" is just hyperbole and unhelpful.
~~~ silentscope
You're right. But as a society, it's up to us to make the world a better place, not excuse it for what it is. I have no idea what happens "mostly" with rape, nor would I use it as a reason why this app isn't bad. There are many problems with online privacy, and I partially blame every app that takes part. The app isn't "rapetastic." It's not inciting violence, it doesn't have a google maps "getaway" feature. It just makes the world more dangerous at worst and more mediocre at best. In cases such as this, I err on the side of more criticism, not less. But that's just me...
------ secoif
Isn't this article doing the exact fear mongering it's accusing technology of doing? And ironically, via technology.
~~~ vrotaru
Well, no. The original article was more like a book about the danger of books. This one is like a book about the mostly-harmlessness of books and more.
{ "pile_set_name": "HackerNews" }
Top Startups That TechCrunch Missed - Nov - morefranco http://www.startupplays.com/blog/top-35-startups-in-tech-that-techcrunch-missed-november-2012/
====== Jabbles
I propose an experiment to return to this page at various points in the future and comment on the status of these startups/websites. I realise that I may be adding to the background noise, but this is ridiculous: <http://imgur.com/F7wkO>
~~~ potatolicious
It's weird that HN is now a thing to be gamed (in this case, poorly). I remember when this was just a forum for people to post links and talk about stuff, and wasn't nearly influential enough to be worth astroturfing and manipulating. It's not just this post either - I've seen first-hand other startups try to manipulate HN. It can be as mild as getting your colleagues to upvote a story, and it can be pretty loathsome like astroturfed comments. Re: the list though, I see a lot of FNACs here (Feature, Not A Company). I love a useful product as much as the next person, but are products that have no visible revenue potential really startups?
------ ebellity
Even though we're in this list (HeyCrowd), I don't get why everyone gets so angry when seeing a list of startups and calling them "junk". It's just a lazy thought process to do that. It's true that most of them will probably fail anyway, so what's the point in destroying them when they're trying their best to bring a vision to life? Personally, I have discovered a bunch of interesting products there.
~~~ dmbaggett
Get used to it. It takes an incredible amount of work to build a "crappy" startup, but less than zero to post a snarky comment to HN that makes one feel superior.
~~~ srameshc
Very true. I hope everyone who is trying to build something and wants to show it to HN realizes this very fact and keeps working regardless of the kind of comments.
------ debacle
Some of these start-ups look like actual start-ups with roads to profitability. Some of them look like side-projects. Either way, regardless of the value of these start-ups it's nice to pierce the YC bubble, if even a little bit.
~~~ blackdanube
Which ones do you think have potential? I'm asking because I'm on the list and would like to know what you think.
------ Dirlewanger
Let's be honest, people: a lot of these are junk and aren't going to be around in 6 months. Another Twitter client??? An application that finds me resorts??? Haven't seen that idea before! The start-up bubble continues to cometh.
------ huhtenberg
So I take it this is just a dump of a month's worth of Betalist with TC features excluded? And what's up with all this astroturfing in the comments? Have some decency and don't hype your own startups, gentlemen.
------ josteink
I would have clicked this list had Techcrunch not been the benchmark it was measured against. Techcrunch and Arrington are a hole of drama and self-obsession and best ignored.
------ muratmutlu
Half of these are coming-soon pages; the author of the article doesn't sound like he's tried any of the services in private beta, he's just made a list of what might be cool. Also, is the eye-drop app a startup or a side-project?
------ demosten
Guys, sorry about not joining the monthly haters' meeting here but to me it looks like a nice list. I agree it's too much Betalist in style though.
------ scottannan
This is a fantastic list - keep it up!
------ RandallBrown
I wish more of these were actually launched instead of just coming soon. As an iPhone/Mac developer, Objective-Cloud is very interesting to me.
A few of the other apps (I wouldn't call most of them startups) look pretty nice as well.
------ noinput
Going to need another cup of coffee now, thanks franco!
------ nancyliang
Awesome list! ... though, there goes my morning, going to be trying out these products.
------ jaysonlane
Awesome list, great job Franco!
------ raldi
Is it me, or is this site completely unreadable on an iPhone?
~~~ richardjordan
Android too. Garbage. I get people not working to optimize for mobile, but at least be readable on mobile, surely.
------ tylercopeland
Great list of some amazing startups!
------ chehoebunj
Hypegram.... Take my money already.
{ "pile_set_name": "HackerNews" }