Ask HN: Where do you get your high quality stickers printed? - mojombo
The last batch of stickers I had printed up for GitHub were low cost but low quality. The image rubbed off of the paper backing (that is, if you could even get the thing *off* of the backing).

I'm going to be printing up a new batch and I want to get some high quality removable stickers made (so they can be put on laptops for instance). Does anybody have a provider that they like?
======
replicatorblog
These companies are all really good, just make sure to select vinyl printing
inks and not paper based stickers otherwise you may get some of the same
issues. These companies are all fairly small so you will get really good
service if you ask any questions:
<http://contagiousgraphics.com/Stickers/stickers.html>
<http://stickerrobot.com/> <http://vgkids.com/stickers.php>
------
spydez
A friend of mine is a designer who started a business doing stickers, business
cards, magnets, etc.
I've had one of his stickers on my Dell laptop for... probably about 2-3
years. I did not treat that laptop daintily, and his sticker still looks
nearly pristine; so he can certainly get you what you need.
His email is chris at trifectaagencies dot com.
------
Mistone
we did an order from www.printingblue.com/ and were happy with the cost and
quality, lead time was a bit long - 10 days - but overall satisfied.
------
gaius
moo.com
Thank you, Dave Winer (social network not graph) - bootload
http://www.roughtype.com/archives/2007/09/thank_you_dave.php
======
natrius
There's a good reason for having both terms. "Social network" can mean either
a site that lets people socialize with each other or one's actual
relationships with other people. It's usually used to mean the former. "Social
graph" is used to mean only the latter.
"Social networks" are the application. The "social graph" is the data.
~~~
alex_c
That's exactly what I was thinking. Seems like an obvious - if somewhat subtle
- difference.
I don't really see what the fuss is all about.
------
karzeem
Like the commenters at Rough Type say, "graph" is just the word for how
programmers conceptualize a network (as in graph theory). Winer is right that
there's no reason to expose consumers to that kind of jargon, though. The
reason that it's popping up is that Facebook has been using it since they
launched their platform in May, I suspect with the hope that the term will
remain closely associated with them so that when people think social networks
(or social graphs, as it happens), they think Facebook.
------
pietro
Lots of people sounding like monkeys in the comment section to the post...
------
mattmaroon
That term is borderline retarded. It's the internet slang equivalent of that
cousin of yours who got held back in the fourth grade.
Hidden portrait 'found under Mona Lisa', says French scientist - Perados
http://www.bbc.com/news/entertainment-arts-35031997
======
meesles
As someone else pointed out before, this was probably nothing noteworthy.
Artists always make multiple layers and improve upon a base; this is just
standard technique. This isn't some Da Vinci Code hidden Illuminati story.
~~~
wavefunction
They also reuse canvases because they tend to be poor and hey, they've already
got this perfectly serviceable canvas sitting around with something they don't
care about.
------
mariuolo
Couldn't they be pentimenti?
------
bammmmm
pics or it didn't happen (i mean the real layer back light see-through stuff).
preferably before and after subtraction of what can be seen without the
backlighting...
The Best Ways to Get to Work, According to Science - awjr
http://gizmodo.com/the-best-ways-to-get-to-work-according-to-science-1733796033
======
eveach
tl;dr Here are the subheadings:
Driving is the most stressful way to commute
It’s also bad for your health
It’s bad for your relationships and community, too
Riding or walking to work makes you healthier and happier
... And the benefits vastly outweigh the risks
Happy commuting!
The traditional FPGA market is abandoned - jamesbowman
http://www.xess.com/blog/extinction-level-event/
======
lvh
I'm not sure I agree with the idea that Intel is going to make these folks
work on something else because FPGAs don't grow that much. That makes little
sense; Intel isn't really hurting for more employees, and the FPGA engineers
can't necessarily be made to be productive on the higher-growth sections
cited.
Instead, it makes a lot more sense that Intel is going to try to make that
market more lucrative and larger. As a major chip manufacturer,
they're in a great position to ship FPGAs to tons of new customers. Chicken-
and-egg problems (you don't use FPGAs until you need to, because they're not
available) have made FPGAs a niche element. However, OpenPOWER/CAPI have
demonstrated that you can get huge benefits from slapping FPGAs onto existing
general purpose compute.
So, TL;DR: I don't think it makes sense to assume Intel is just going to
assume that market will stay what it is. Instead, they think they can make
that market better, and do more than Altera/Xilinx can do individually. In
that light, a purchase makes perfect sense.
(Disclaimer: I work for a company that is involved in OpenPOWER/OpenCompute
and has shipped hardware that does this.)
~~~
Retric
FPGAs are always significantly worse than dedicated HW. To sell them _in
bulk_ you need to be able to replace a wide range of chips, which you don't
all need at the same time, with a good enough solution. At the same time you
need enough speed that you can't just simulate things on a general purpose
processor.
That's a fairly narrow band, and the real issue with FPGAs. A synthesizer, for
example, seems like a good fit, but general purpose processors are fast
enough. If we had a wider range of video codecs that might work, but again
general purpose processors are good and dedicated hardware can help for the
mainstream codecs.
Any ideas?
~~~
dv_dt
Anything that needs to run in a pipelined data flow - particularly if the
CPU processing of that data would be non-trivial. One example might be video
processing (the kind that might go in all those automated cars on the
horizon). You might have one set of algorithms at initial release, but in
future releases one might change them.
Other examples might be crypto or compression/decompression, transcoding...
~~~
tarpherder
Wouldn't a GPU outperform a FPGA in those cases?
~~~
dv_dt
It depends - for outright throughput for something the GPU is good at perhaps,
but if Intel is coupling the FPGA tightly to their CPU and it has access to
the memory bus and maybe some level of cache, then (I'm really speculating
here) it might be very advantageous for situations where you have small chunks
of data that you want to operate on with very low latency (lower latency by at
least two PCIe transfers, to & from the GPU, and likely another memory
transfer between the card local memory and the GPU).
Then there are also likely some functions you can directly implement
on an FPGA that a GPU simply didn't elect to implement in its hardware or
would need to compose out of a number of operations.
------
PaulHoule
People are still scratching their head over this acquisition but here is my
take.
Intel has failed at phones not because "x86 sux" but because Intel makes a
small number of SKUs that are often different bins of the same design -- these
are heavily tuned in every way. Phone chips are not so optimized in terms of
process and design, but instead they are true "systems on chip" where a number
of IP blocks are integrated to produce something which is optimizable by
adding functional blocks.
Something FPGA based, whether it ends up in the datacenter or elsewhere, could
have a highly optimized FPGA block which is field configurable, so this gives
Intel something to compete with SoC vendors on their own terms.
One detail is the nature of the memory and I/O systems. FPGAs can move a huge
volume of data, so unless you upgrade the paths in and out of the CPU/FPGA
area, the FPGA would be starved for bandwidth.
It would take one of two things for the FPGA market to expand based on these
developments.
First, if there was some "killer app" that sold a huge number of units, that
would help. The trouble here is that if you sell a huge number of units, you
might be better off making an ASIC functional block and integrating it into a
true SoC.
The other one is that (I think) FPGA development today is constrained by the
awful state of tooling. Pretty soon you will be able to pop an FPGA hybrid chip
into a high-end Xeon workstation and have one hell of a development kit in
terms of hardware, but without a big improvement in the tooling, very few
people will be able to really make use of it.
~~~
rwmj
I think your analysis of Intel's problem is spot on. But FPGAs have something
like (IIRC) 4% of the density of custom-designed ASICs. Can Intel's superior
process create FPGAs at such small scales to overcome this rather large
disadvantage?
Agree about the terrible tooling. 'Twas always thus, unfortunately, even when
I was doing PLDs back in 1991.
~~~
PaulHoule
Well, the economics get awful for designing chips as the feature size gets
smaller, which is keeping 28nm fabs pretty busy and will just get worse at 10
or 8 nm or whatever comes next.
------
dmytroi
Ehm, FPGA/CPLDs are everywhere. It's hard to say, but my educated guess is
that the hardware market consumes similar amounts of SoC-like and FPGA/CPLD
chips. Look, even the Pebble smartwatch uses a Lattice iCE40 FPGA, and
MacBooks also usually have a few FPGAs here and there. FPGA/CPLDs are
essential for implementing "glue logic", which does what it says: glues
different pieces of hardware together. Of course you can do "glue" in
software, but then your customers will enjoy poor battery life and random
glitches here and there. And nobody is going to spend big money (it actually
depends; sometimes it's $1M+, sometimes you can get away with $30-50k) on
running their own ASICs when you can just buy jellybean FPGAs.
Indeed, the phone/desktop market might move to a more one-chip-for-everything
solution, but even then we need glue-ish logic to control things like screen
backlight DC-DC converters, charging ICs, etc., which is much easier done from
FPGA/CPLD-like devices. On the other hand, FPGA/CPLDs are essential in some
classes of devices, for example test instruments: modern oscilloscopes usually
have 3+ FPGAs in them. Companies like Keysight usually only run custom ASICs
when they hit the limitations of current silicon tech; for example their
N2802A 25GHz active probe (starting from $23,500) uses an amplifier IC made in
an indium phosphide (InP) process, which is kinda far away from whatever
current consumer product companies are doing. You can check the teardown of
this beast here:
[https://www.youtube.com/watch?v=jnFZR7UsIPE](https://www.youtube.com/watch?v=jnFZR7UsIPE)
So in my opinion the FPGA/CPLD market will live long and strong; players might
change, but demand is enormous. The only problem in my opinion is that the
whole market is more B2B-like (FPGAs are usually just humble ICs inside end
customers' products; you don't see "Altera inside" stickers or anything on the
products themselves), so it's kinda hard to get a grasp of what's going on.
------
kevin_thibedeau
XESS is biased toward the segment of the market they play in, which is
centered around computational applications where having a micro on board is
usually desirable. There are _many_ more places where simple glue logic is
needed and a small, cheap FPGA/CPLD fits the bill. One would hope that
Altera's (and possibly Xilinx's) new masters won't fuck this up and shutter
their low end devices, but that doesn't mean there won't be demand for them
into the future. People with a need for these types of FPGAs aren't XESS
customers, so XESS is blinded by the limited scope of its own business.
------
kqr2
In case the link is down:
[http://webcache.googleusercontent.com/search?q=cache:http://...](http://webcache.googleusercontent.com/search?q=cache:http://www.xess.com/blog/extinction-
level-event/)
Also, I believe that cheap microcontrollers have been able to replace FPGAs in
some cases.
~~~
LeifCarrotson
Cheap microcontrollers can replace old CPLDs, which now are technically FPGAs
but really don't have much of a place in modern hardware.
They can also replace FPGAs because even small microcontrollers are now really
systems-on-a-chip, with integrated SPI, CAN, LVDS, or even USB/transceivers.
To me, that's more like replacing an FPGA with dedicated hardware - it's just
that microcontroller peripheral options are so thorough that dedicated
hardware no longer needs to be custom.
~~~
ArkyBeagle
It's mixed. You can spin a fast loop in a PIC @ 1, 2, 4 or 8 MSPS, and that
eats into FPGA territory.
USB is uniformly terrible; multi-PHY CAN offerings are disappearing. SPI can
be good if you have the right interface chip (which you may or may not
actually know until you're running against a prototype).
LVDS looks interesting, but it seems to have been largely supplanted by
SPI/I2C and Ethernet.
I dunno; it's just different. Phones have warped the market in puzzling ways.
~~~
justaaron
Spot-on re USB = terrible (for typical embedded usage - derived clock,
latency, fancy software/hardware stack support requirements; hardly a simple
interface, not worthy of universality), and of course it's because of the
phone SOC market...
Basically the entire high-end of embedded has been taken over by poorly-built
Linux distros running on phone SOC reference board designs...
As much as I'd love to support the 96boards effort, they are pushing us all
into using USB for things like audio codecs... (I'll take TDM'ed I2S anytime,
as at least there is a real bitclock as well as a frameclock, directly wiggled
chipside...)
~~~
ArkyBeagle
I use an inexpensive USB audio interface daily, and _that_ part isn't bad at
all. I'm able to run two DAWs against it: one for a cue mix with VST plugins
and another for recording.
If you're careful with part selection and systems engineering on the host
side, it works extremely well.
I'm also loath to throw too many rocks at the SOC boards - they make fine
prototypes that can then be adapted to something more appropriate.
------
JoachimS
The article is based on the premise that Altera will stop developing FPGAs for
general availability. Nothing I've seen so far (and I would love to be shown
otherwise) indicates this.
FPGAs have been used as test designs when qualifying fabs for volume
production, since they are regular, but a more complex and closer match to
general ASICs than memories are.
FPGAs are also used more and more either with internal CPU cores (hard
or soft) or as a companion to CPUs providing acceleration, esp data plane
processing.
This means that the acquisition of Altera is more about adding a business that
complements the CPU business than removing a competitor from another market
segment. Intel can sell CPU+FPGA solutions for data centers and big data. But
it can also sell more chips and increase utilization of its fabs.
And for FPGA users, the FPGAs coming out of these fabs will probably be
better, with higher density and lower power consumption than what Altera
managed to design themselves. And getting a Stratix or a Cyclone SoC with an
Atom core inside running Linux would be a very neat solution.
------
petra
Or there's another alternative. The main reason it was hard to make money from
the industrial etc. segment was low volume combined with a lot of support
costs.
But what if
a. Xilinx built C-like tools that enabled embedded software engineers to
develop easily?
b. they released those freely to some segment of the market?
c. they built an external support and IP ecosystem, either open or closed or
both?
Those actions could increase margins for Xilinx, and they seem to be doing a
and b.
As for the hardware, maybe the article is right. Also, recent industrial chips
are using 28nm, and going beyond that is extremely expensive and might not fit
the industrial scenario anyhow; maybe there's not a lot of innovation left in
the industrial FPGA market.
~~~
chriscool
> Also ,recent industrial chips are using 28nm, and going beyond that is
> extremely expensive
Yeah, so ASICs that are not produced in big enough volume will not move to
14nm or less. This means that FPGAs, which can move to 14nm or less because of
bigger volume, may become competitive against those ASICs.
~~~
petra
When I say "industrial chips are at 28nm" I mean industrial FPGAs.
Will Xilinx create industrial FPGAs at 14nm or beyond?
First, transistor cost will have to become meaningfully lower than 28nm
transistor cost. That only happens at 10nm. But at that node, NRE costs are
extremely high. Spread over low-cost low-end industrial chips, this requires a
huge volume, which Xilinx probably doesn't have yet.
Also, couple that with 28nm being much more reliable (all the failure
mechanisms increase at 10nm: electromigration in wires, thermal hot spots,
transistor fin self-heating), and since reliability is key for industrial, it
would be hard to see industrial moving beyond 28nm.
------
payne92
I think it gets squeezed even more with general purpose GPU computing.
~~~
exelius
There's no reason you can't have general-purpose SoCs that combine traditional
pipeline-driven CPU cores, GPGPU cores and FPGA cores with memory, bus chips,
storage, etc. We long ago reached the point of diminishing returns for
cramming more raw computing power on a chip, so chips of the future will
likely consolidate specialized features in the name of power savings and speed
increases for certain workloads.
~~~
rjsw
Plus Intel owns the IP to all the pieces and is happy to provide documentation
for them.
Xilinx have Mali GPUs on their latest ARM+FPGA hybrids but that isn't much use
to me if I can't find out how to program it.
Hey, Tesla Fans: Drive the Porsche Taycan Before You Criticize It - clouddrover
https://www.thedrive.com/opinion/31091/hey-tesla-fans-drive-the-porsche-taycan-before-you-criticize-it
======
fastbeef
Just venting, but holy hell are cars the stupidest “hobby” I can imagine.
~~~
LeftHandPath
Are they though?
The more I’ve read about human psychology - particularly about “flow” - the
more valuable manually driven cars seem.
Similarly to how many people are actually happiest / have the highest sense of
self-worth while at work (even though they’d scream otherwise), a lot of
people are able to relax and think freely while driving. And in general,
humans seem to enjoy having control over G-force (whether jumping off of
cliffs, flying planes, riding roller coasters, driving...)
Flow:
[https://en.m.wikipedia.org/wiki/Flow_(psychology)](https://en.m.wikipedia.org/wiki/Flow_\(psychology\))
Interesting take on automotive tech and psychology from Nicholas Carr’s “Glass
Cage”, reviewed by NY Times:
[https://www.nytimes.com/2014/11/09/books/review/the-glass-
ca...](https://www.nytimes.com/2014/11/09/books/review/the-glass-cage-by-
nicholas-carr.html)
~~~
fastbeef
I never thought of it this way. Thank you for providing some perspective, it
made me reflect on my thoughts on the subject.
Testing Without Mocks: A Pattern Language - jdlshore
http://www.jamesshore.com/Blog/Testing-Without-Mocks.html
======
miensol
I like your article a lot. I do have a hard time mapping the layers you've
described (Application, Logic, Infrastructure) to a mobile/frontend/UI
application.
For instance, in Android we have Activities/Fragments (and less commonly
Views) as the entry points to the application code. From there we might have [View
Models]([https://developer.android.com/topic/libraries/architecture/v...](https://developer.android.com/topic/libraries/architecture/viewmodel))
and External API clients that are clearly _infrastructure_.
From what I've seen it is common to call the External API clients from
ViewModels directly. (This is so we can test more logic outside of Android
runtime, AFAIK.)
If I get the layers you've described, we should instead let the
Activity/Fragment ask the ViewModel for some data and then use it to call the
External API, correct?
~~~
jdlshore
Thanks, I'm glad you like it. I haven't done any Android work, so I'm not sure
how the A-Frame Architecture applies. From my quick glance at the ViewModel
documentation you linked, that looks like Application layer code: in the first
example in that document, the `loadUsers()` call would turn around and call
a specific Infrastructure Wrapper responsible for communicating with the
service that provides the users.
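To make that concrete, here is a minimal TypeScript sketch of the shape I
mean; all the names (`UserServiceClient`, `UserListViewModel`, the `/users`
endpoint) are invented for illustration, not taken from the article:

    interface User { id: string; name: string; }

    // Infrastructure Wrapper: the only code that talks to the external service.
    class UserServiceClient {
      constructor(private baseUrl: string) {}
      async fetchUsers(): Promise<User[]> {
        const response = await fetch(`${this.baseUrl}/users`);
        if (!response.ok) throw new Error(`request failed: ${response.status}`);
        return (await response.json()) as User[];
      }
    }

    // Application layer: orchestrates, but holds no networking details itself.
    class UserListViewModel {
      users: User[] = [];
      constructor(private client: UserServiceClient) {}
      async loadUsers(): Promise<void> {
        this.users = await this.client.fetchUsers();
      }
    }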
------
JayBazuzi
> there's no need to create interfaces for dependencies
I think in this context you're using `interface` in the way that C# uses it: a
point of abstraction and indirection.
I think you aren't using it in the API sense: the external surface of a thing.
If that's correct, you may want to clarify your wording. Perhaps "abstract
interface".
~~~
jdlshore
I meant it in the sense of the language construct. I'll think about how to
make it clearer...
~~~
jdlshore
I ended up just taking that paragraph out. I don't think it added much.
------
wpannell
Interesting innovation. Coincides with the 2nd Edition of Martin Fowler's
"Refactoring," for which he chose to code in JavaScript. Interested in
feedback from the mocking pioneers — Steve Freeman, Nat Pryce, Keith
Braithwaite — and, of course, the father of TDD — Kent Beck.
Is there sample code illustrating the pattern language?
~~~
jdlshore
Other than the sample code in the article, my "WeeWikiPaint" project
demonstrates most of the infrastructure patterns, particularly in its server
code:
[https://github.com/jamesshore/lets_code_javascript/tree/epis...](https://github.com/jamesshore/lets_code_javascript/tree/episode614/src/server)
------
JayBazuzi
> This can result in multiple tests failing when a bug is introduced.
One way I approach this is "a bug will only cause one test to fail _at that
level of abstraction_".
------
JayBazuzi
I don't see ports-and-adapters (aka Hexagonal Architecture) mentioned. Is that
because you don't like it / don't find it useful?
~~~
jdlshore
I don't feel like I fully understand it.
~~~
JayBazuzi
P-and-A makes Simulators possible, which are the one kind of test double I'm
happy with. But your pattern language seems complete without it, so maybe I
don't need it as much as I thought.
Simulators fit right where you have Nullable Infrastructure. Having a null
object and a real object, instead of a single object with both null and real
behaviors, is how I'd normally organize things. I write my Focused Integration
tests so they can be run against both the real implementation and the
simulator, to ensure that they both agree on the contract. And just like the
Nullable Infrastructure, I can use the simulator in tests.
However, the real win with Ports-and-Adapters isn't the simulators (even
though everyone notices that first). It's the way a good Port abstraction lets
you organize your code well, writing it in terms that make sense in your
domain, instead of in terms of the dependency.
This is worth exploring further.
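Roughly, the shape I use is one contract suite parameterized over
implementations. A TypeScript sketch, assuming a Jest-style runner
(`describe`/`it`/`expect`) and invented names:

    interface KeyValueStore {
      put(key: string, value: string): Promise<void>;
      get(key: string): Promise<string | undefined>;
    }

    // The simulator: in-memory, but expected to honor the same contract.
    class InMemoryStoreSimulator implements KeyValueStore {
      private data = new Map<string, string>();
      async put(key: string, value: string): Promise<void> {
        this.data.set(key, value);
      }
      async get(key: string): Promise<string | undefined> {
        return this.data.get(key);
      }
    }

    // One suite, run against every implementation that claims the contract.
    function contractTests(name: string, makeStore: () => KeyValueStore): void {
      describe(`KeyValueStore contract (${name})`, () => {
        it("returns what was stored", async () => {
          const store = makeStore();
          await store.put("k", "v");
          expect(await store.get("k")).toBe("v");
        });
        it("returns undefined for a missing key", async () => {
          expect(await makeStore().get("nope")).toBeUndefined();
        });
      });
    }

    contractTests("simulator", () => new InMemoryStoreSimulator());
    // contractTests("real", () => new RealStore(/* connection details */));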
~~~
jdlshore
One of the key ideas of the Nullable Infrastructure is that all the semantics
of the infrastructure class remain _exactly the same_ other than disabling
external communication. So any logic inside the infrastructure code still
runs. This is important because we don't have broad integration tests.
That's why the null code is inside the infrastructure, instead of using the
classic Null Object pattern. It's also a major difference between it and
classic test doubles. And perhaps your simulators?
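If it helps, here's a minimal TypeScript sketch of what I mean, with invented
names (the article's real examples go further):

    // The "null" variant lives inside the class, so everything except the
    // actual external call runs identically in production and in tests.
    class MailClient {
      private constructor(
        private send: (to: string, body: string) => Promise<void>,
      ) {}

      static create(): MailClient {
        return new MailClient(realSend);        // talks to the real server
      }

      static createNull(): MailClient {
        return new MailClient(async () => {});  // external communication disabled
      }

      async sendWelcome(to: string): Promise<void> {
        // Validation, formatting, logging, etc. still run in both modes.
        if (!to.includes("@")) throw new Error(`invalid address: ${to}`);
        await this.send(to, "Welcome aboard!");
      }
    }

    async function realSend(to: string, body: string): Promise<void> {
      // ...the real SMTP/HTTP call would live here...
    }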
Google’s not-so-secret new OS - techenthusiast
https://techspecs.blog/blog/2017/2/14/googles-not-so-secret-new-os
======
michaelmrose
It was unfortunately obvious that the writer had insufficient tech chops when
he used the phrase
"a post-API programming model"
But pressing on, he somehow manages to blame the lack of updates to Android
phones on the modularity of the Linux kernel. The joke of course being that
Linux is monolithic and Google's new OS is a microkernel, ergo more modular.
The quote is "...however. I also have to imagine the Android update problem (a
symptom of Linux’s modularity) will at last be solved by Andromeda"
It's hilarious that he can somehow, defying all sanity, ascribe Android's
update issue to an imagined defect in Linux. Android phones don't get updated
because ensuring their pile of hacks works with a newer version of Android
would represent a non-trivial amount of work for the OEM, who already has your
money. The only way they can get more of your money is to sell you a new
phone, which they hope to do between 1-2 years from now.
In short, offering an update for your current hardware would simultaneously
annoy some users who fear change, add little for those who plan to upgrade to
a new model anyway, decrease the chance that a minority would upgrade, and
cost them money to implement.
It's not merely not a flaw in the underlying Linux kernel; it's not a
technical issue at all.
~~~
pjmlp
> The only way they can get more of your money is to sell you a new phone
> which they hope to do between 1-2 years from now.
The thing is, this only works in countries similar to the US, where most
people are on contracts.
In the rest of the world, where people are on pre-paid, we use our phones
until they either die or get stolen, which is way more than just 1-2 years.
~~~
TeMPOraL
Bump it to 3-4 years then. Bloated manufacturer updates combine with bloat in
most popular applications and the regular web bloat to make the phone unusably
slow after a few years.
Between that and fragility of smartphones (mechanical damage, water damage),
most people are bound to replace theirs every few years anyway.
~~~
pjmlp
> Bump it to 3-4 years then. Bloated manufacturer updates combine with bloat
> in most popular applications and the regular web bloat to make the phone
> unusably slow after few years.
My S3 is 4 years old now, and it works perfectly fine.
When it dies, I will most likely adopt one of my Lumia devices as my main one,
or will buy a 2nd hand Android device, instead of giving money to support bad
OEMs.
~~~
TeMPOraL
My SO's S3 is of more or less the same age and it's so slow that it's barely
usable now. Still can't track down why - she is not a power user, she wasn't
installing apps beyond the few things I installed for her and the OS updates.
My old S4, currently used by my brother, suffered the same fate, being slow
even after a factory reset. I wonder where this comes from?
~~~
pjmlp
I had some performance issues, but they got sorted out when I swapped the
battery for a new one.
~~~
TeMPOraL
A _battery_? How would that help? Not doubting you, just honest question. I
feel like I'm missing something very important about how smartphones work.
~~~
the_af
Mmm... it also happened to me with my (now defunct) Galaxy S1. I can't explain
it, but at some point its battery developed the "bloated, about to explode"
look and the phone worked but it was very slow and crashed frequently. I
changed the battery and everything was ok. Later it died of unrelated causes.
------
MarkMc
It bothers me that Google does not seem particularly interested in doing the
one thing that would make their Android platform absolutely dominant: Allow
Chrome to run Android apps on Mac and Windows.
Google has already done 90% of the necessary work by adding Android apps to
ChromeOS. _Two and a half years ago_ it created "App Runtime for Chrome" which
demonstrated that Android apps could run on Windows and Mac in a limited,
buggy way [1]. If Google had put meaningful effort into developing such a
strategy we would by now have a relatively simple way to develop software
which runs on 99% of laptops and 85% of smartphones and tablets. Developers
would now be targeting 'Android first' instead of 'web app first then iOS then
maybe Android'.
Sundar, if you're reading this - do it!
[1] [https://arstechnica.com/gadgets/2014/09/hack-runs-android-
ap...](https://arstechnica.com/gadgets/2014/09/hack-runs-android-apps-on-
windows-mac-and-linux-computers/)
~~~
mark242
Sun tried that, back in the day. Maybe you heard about Java applets, maybe you
didn't. They were the slowest thing about the web, insecure even with a
sandbox, and just an overall pain. Short of having a jvm always running on
your machine, the performance of Android-via-Chrome will completely turn
people off the Android ecosystem.
~~~
josefx
I remember two issues with Java Applets:
* Java itself was slow for a long time
* The Browser would hang while loading an Applet
The first is no longer an issue. They can just use a modern just-in-time
compiler and it won't run slower than Java on Android. Chrome already has one
to deal with JavaScript-powered Web 2.0 applications.
The second was, as far as I can tell, an API issue. Applets would block
everything by default until they were loaded. A really bad idea in a single-
threaded environment when you had to send several MB over low bandwidth and
the JVM itself took a long time to start. Just making the load async with a
completion callback could have solved this issue, and I remember a few Applets
that actually used an async download to reduce the hang.
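Something like this sketch (TypeScript, names and URL invented) is all it
would have taken: download in the background and hand the payload to a
callback, so the page never blocks:

    function loadComponent(
      url: string,
      onReady: (payload: ArrayBuffer) => void,
      onError: (err: Error) => void,
    ): void {
      fetch(url)
        .then(resp => {
          if (!resp.ok) throw new Error(`download failed: ${resp.status}`);
          return resp.arrayBuffer();
        })
        .then(onReady) // the page stayed responsive while the download ran
        .catch(onError);
    }

    loadComponent(
      "/applet-payload.bin",
      payload => console.log(`loaded ${payload.byteLength} bytes`),
      err => console.error(err),
    );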
~~~
username223
You missed the biggest issue: "write once, mediocre everywhere." Windows, Mac,
and X were all different, and Java Applets were necessarily bad at emulating
all of them. While there are fewer Unices today, there are more GUIs, and
cross-platform apps suck at least as much.
~~~
MarkMc
"Write once, mediocre everywhere" was a problem with Sun's implementation, not
with the concept of cross platform code. There are tons of webapps which are
very successful, despite being written 'once'.
In any case, Google doesn't need to be as strict as Sun was. It is free to
implement "write 90% of your code once and 10% customised for each platform".
~~~
flukus
> There are tons of webapps which are very successful, despite being written
> 'once'.
Actually they suffer from most of the same problems, only computers have
gotten faster (masking performance issues) and our expectations have lowered.
How many of these web apps obey the native OS theming, for instance?
~~~
mercer
1) The fact that webapps _can_ run relatively well strikes me as hopeful,
considering how much more inefficient using HTML/CSS/JS is compared to Java
applets. Or is the latter not the case (honest question)?
2) I'm not sure if our expectations have lowered much. Perhaps it's more that
_mobile_ interfaces are generally simpler and thus easier to make 'native
enough'?
Although I think there's more going on in regards to 2. I was never bothered
so much by the UI of a java applet looking different. What bothered me was
that even very fundamental stuff like input fields and scrolling felt both
alien and shittier than native. And while it's certainly possible to make a
web app just as shitty, if you rely on 'stock' html elements, a lot of the
subtle native behavior carries over.
Just a few weeks ago, for example, I built a web-app for mobile devices. It
felt off immediately because the scrolling didn't feel right. All I had to do
was turn on momentum scrolling (with a line of iOS-specific CSS), and the
scrolling suddenly felt native. Had I used a hypothetical Java applet
equivalent, I might've had to either go for a non-native-feeling scroll or
build it myself.
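For the curious, the line in question was the `-webkit-overflow-scrolling:
touch` rule. Here's the same thing applied from TypeScript, with a made-up
selector:

    // Enable iOS momentum scrolling on a scrollable container.
    const scroller = document.querySelector<HTMLElement>(".scroll-container");
    if (scroller) {
      scroller.style.overflowY = "auto";
      // Vendor-prefixed property, so it isn't on the typed CSSStyleDeclaration.
      scroller.style.setProperty("-webkit-overflow-scrolling", "touch");
    }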
While I of course can't prove any of this, I think what people care about is
that things _feel_ native, not the 'skin' used to display it.
~~~
flukus
> considering how much more inefficient using HTML/CSS/JS is compared to Java
> applets. Or is the latter not the case (honest question)?
It's a really interesting question actually because it's so hard to compare
the two. On any objective measure, today's web apps are much better than
applets in terms of responsiveness, etc. But then again, an applet could run
on machines with 16MB of RAM total. I think you'd be hard pressed to get plain
html page in a modern browser to run on a machine like that. Either way, in
both cases we had a much better solution in native apps.
> 2. I was never bothered so much by the UI of a java applet looking
> different. What bothered me was that even very fundamental stuff like input
> fields and scrolling felt both alien and shittier than native.
Modern web apps can score better here, but quite often they don't. The more
complex they become, the less native they get. Scrolling, text input, etc. are
generally OK (unless you're an arsehole that overrides scroll behaviour), but
HTML still doesn't have an equivalent for native table views and the goodies
(navigation, resizing, performance) that come with them.
For me the skinning does matter though, I have a beautiful, consistent desktop
that browsers (not even electron apps) shit all over. When something doesn't
look quite right from the second you open it it magnifies all the other
differences.
~~~
mercer
> Modern web apps can score better here, but quite often they don't. The more
> complex they become, the less native they get. Scrolling, text input, etc.
> are generally OK (unless you're an arsehole that overrides scroll
> behaviour), but HTML still doesn't have an equivalent for native table views
> and the goodies (navigation, resizing, performance) that come with them.
Oh yeah, complex UI stuff is definitely a good reason to avoid web apps.
But for many, probably even most apps it's precisely scrolling, text input,
and other 'basic' stuff that matters, and in those cases a web app's 'default'
will be more native.
> For me the skinning does matter though, I have a beautiful, consistent
> desktop that browsers (not even electron apps) shit all over. When something
> doesn't look quite right from the second you open it it magnifies all the
> other differences.
I agree on a personal level, but I suspect we're outliers. Can't substantiate
that at the moment though, so I might be wrong.
------
vii
It's awesome that Google is doing this and in public too
[https://fuchsia.googlesource.com/](https://fuchsia.googlesource.com/)
Unfortunately, the hard part of an operating system isn't in a cool API and a
rendering demo. It's in integrating the fickle whims of myriad hardware
devices with amazingly high expectations of reliability and performance
consistency under diverse workloads. People don't like dropped frames when
they plug in USB :) Writing device drivers for demanding hardware is much
harder than saving registers and switching process context. The Linux kernel
has an incredible agglomeration of years of effort and experience behind it -
and the social ability to scale to support diverse contributors with different
agendas.
Microsoft, with its dominant position on the desktop, famously changed the
'preferred' APIs for UI development on a regular cadence. Only Microsoft
applications kept up and looked up to date. Now Google has such a commanding
share of the phone market - Android is over 80% and growing
[http://www.idc.com/promo/smartphone-market-
share/os](http://www.idc.com/promo/smartphone-market-share/os) - they have a
huge temptation to follow suit. Each time that Microsoft introduced a new
technology (e.g.
[https://en.wikipedia.org/wiki/Windows_Presentation_Foundatio...](https://en.wikipedia.org/wiki/Windows_Presentation_Foundation)
WPF) they had to walk a fine line between making it simple and making sure
that it would be hard for competitors to produce emulation layers for.
Otherwise, you could run those apps on your Mac :)
There are many things to improve (and simplify) in the Android APIs. It would
be delightful to add first class support for C++ and Python, etc. A project
this large will be a monster to ship, so hopefully we'll soon (in a few years)
see the main bits integrated into more mainstream platforms like
Android/Linux - hopefully without too much ecosystem churn.
~~~
npsimons
> It's in integrating the fickle whims of myriad hardware devices with
> amazingly high expectations of reliability and performance consistency under
> diverse workloads.
So much this; Linux Plumbers conference years ago was bitching about how every
gorram vendor wanted to be a special snowflake, so even though the
architecture was ARM, you basically had to port the kernel all over again to
every new phone. I haven't kept up with it, but I can't imagine it's gotten
better. The problems they're listing as reasons to move to a new kernel aren't
caused by Linux and they won't go away until you slap the vendors and slap
them _hard_ for the bullshit they pull, both on developers and users.
As for kernel ABI, this has been rehashed to death: just release your fucking
driver as open source code, and it will be integrated and updated in mainline
_forever_:
[http://www.kroah.com/log/linux/free_drivers.html](http://www.kroah.com/log/linux/free_drivers.html)
~~~
bjackman
Overall I agree with your sentiment, but it's not just a case of "releasing
your drivers" but also of getting them accepted by maintainers. If you don't
have an awareness of this process from the beginning of your development cycle
then it can be a massive amount of work.
------
conradev
The drivers-wifi repository contains a stub for a Qualcomm QCA6174 driver[1]
which is found in the Nexus 5X[2], OnePlus 2[3] and meant for smartphones[4].
The drivers-gpu-msd-intel-gen repository contains drivers for Intel 8th and
9th gen integrated graphics[5]. I think it's fair to propose that Google plans
on running Fuchsia on both smartphones and laptops…
[1] [https://github.com/fuchsia-mirror/drivers-
wifi/blob/master/q...](https://github.com/fuchsia-mirror/drivers-
wifi/blob/master/qualcomm/driver.cc)
[2]
[https://www.ifixit.com/Teardown/Nexus+5X+Teardown/51318#s112...](https://www.ifixit.com/Teardown/Nexus+5X+Teardown/51318#s112148)
[3]
[https://www.ifixit.com/Teardown/OnePlus+2+Teardown/45352#s10...](https://www.ifixit.com/Teardown/OnePlus+2+Teardown/45352#s100455)
[4] [http://www.anandtech.com/show/7921/qualcomm-announces-
mumimo...](http://www.anandtech.com/show/7921/qualcomm-announces-
mumimo-80211ac-family-increasing-the-efficiency-of-80211ac-networks)
[5] [https://github.com/fuchsia-mirror/drivers-gpu-msd-intel-
gen/...](https://github.com/fuchsia-mirror/drivers-gpu-msd-intel-
gen/blob/master/src/device_id.h)
------
resoluteteeth
Is this an actual plan of Google as a company, or is this some sort of
Microsoft-style war between divisions where the Chrome team has just decided
on its own that the future is based on Chrome and Dart?
Also, considering the way that the ARC runtime for Chromebooks was a failure
and had to be replaced by a system that apparently essentially runs Android in
a container, will it really be possible for a completely different OS to
provide reasonable backward compatibility?
~~~
throwawaydbfif
I would say that Google is trying to replace JavaScript with Dart in any way
they possibly can. The reason is simple: JavaScript is an open standard; Dart
is owned by Google.
The claim that "Dart is better" is the typical Google koolaid before they
attempt a market takeover, as we've seen over and over with Android, Chrome,
and AMP especially. Google loves to make glass-house open source projects you
can't touch. You're free to look at how great it is, feel its well-refined
curves and admire the finish, but God help you if you don't like how the
project is going and want to fork it for yourself.
Don't bother trying to commit a new feature to any of Google's software that
they don't agree with. It will languish forever. Don't bother forking either,
because they'll build a small proprietary bit into it that grows like a tumor
until it's impossible to run the "open source" code without it.
Fuck Dart, I don't care how great it is. Microsoft is being the good one in
this case by extending JS with TypeScript; Google is trying to upend it into
something that they control.
~~~
isoos
Looks like alternative facts have reached the tech world too?
You can take as hard a look at Google as you would like, but choosing
Microsoft over Google (one for-profit company over another), while not caring
how the technology, the licensing or the workflow compares, is a bit
hypocritical (e.g. they are both open, and they both have rules for commits).
I'm wondering, why do you need a throwaway for such heavily invested FUD? Your
other comments here are in a similar tone, and I'm surprised to see such
hatred without any obvious trigger. Maybe if you came forward with your story,
it would be easier to discuss it?
disclaimer: ex-Googler, worked with Dart for 4+ years, I think it is way ahead
of the JS/TS stack in many regards.
~~~
logicchains
>I think it is way ahead of the JS/TS stack in many regards.
In what ways do you consider it ahead of TypeScript? Personally, as someone
who's particularly fond of static type systems (Haskell and the like), I find
that TypeScript's type system seems way more advanced and powerful than Dart's
(union and intersection types in particular, and non-nullable types). Mapped
types (introduced in TypeScript 2.1) also seem pretty interesting.
~~~
isoos
Some of my earlier notes are in this thread (it is more about the day-to-day
features I actually use and like, and less about the fine details of the type
system):
[https://news.ycombinator.com/item?id=13371009](https://news.ycombinator.com/item?id=13371009)
Personally I don't get the hype around union types: at the point where you
need to check which type you are working with, you may as well use a generic
object (and maybe an assert if you are pedantic).
Intersection types may be a nice subtlety in an API, but I haven't encountered
any need for them yet. Definitely not a game-changer.
I longed for non-nullable types, but as soon as Dart had the Elvis operator
(e.g. a?.b?.c evaluates to null if any of them is null), it became easy to
work with nulls. Also, there is a lot of talk about them (either as an
annotation for the Dart analyzer or as a language feature), so it may happen.
Mapped types are interesting indeed. In certain cases it really helps if you
are operating with immutable objects, and mapping helps with that (although it
does not entirely solve it, because the underlying runtime does allow changes
to the object).
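A tiny TypeScript sketch of the equivalent null-safe chaining (types
invented), for readers comparing the two:

    interface C { value: number; }
    interface B { c?: C; }
    interface A { b?: B; }

    function read(a?: A): number | undefined {
      // Evaluates to undefined as soon as any link in the chain is missing,
      // mirroring the Dart chain a?.b?.c.
      return a?.b?.c?.value;
    }

    console.log(read(undefined));                   // undefined
    console.log(read({ b: { c: { value: 42 } } })); // 42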
~~~
throwawaydbfif
I agree about union types. They can quickly result in insane variable
declaration statements that are hard to understand.
I dislike nulls though, I always wish people would just use a flag or error
handling when objects are undefined, instead of "hey this object is the flag
and sometimes it's not actually an object!"
You'd think language designers would learn after dealing with null pointers :)
------
maheart
How credible is this source?
I don't understand half the decisions outlined in the article.
> I also have to imagine the Android update problem (a symptom of Linux’s
> modularity)
I seriously doubt the Linux kernel is anything but a minor contributor to
Android's update problem. Handset developers make their money by selling
physical phones. In two years, your average consumer probably doesn't care if
their device is still receiving software updates. They'll jump onto a new
phone plan, with a fresh, cool new mobile, with a better screen, newer
software (features!), and a refreshed battery.
Maintaining existing software for customers costs handset manufacturers $$$,
and disincentivizes consumers from purchasing new phones (their cash cow). The
money is probably better spent (from their POV) on new features and a
marketing budget.
~~~
felixge
> In two years, your average consumer probably doesn't care if their device is
> still receiving software updates. They'll jump onto a new phone plan, with a
> fresh, cool new mobile, with a better screen, newer software (features!),
> and a refreshed battery.
This might be true for the US, where 75% of subscribers are on post-paid
(contracts). It's not true for the rest of the world.
* Europe: < 50% post-paid
* Rest of the world: < 22% post-paid
I'd also argue that Android users will be more likely to be pre-paid than
post-paid customers (compared to iPhone users) in all of these regions, but I
have no data to back it up.
Anyway, I agree that it's probably not very profitable, if at all, for
Android handset makers to support their devices for > 2 years. But I think
many customers would benefit from it ...
[1] [http://www.globalrewardsolutions.com/wp-
content/uploads/GRS-...](http://www.globalrewardsolutions.com/wp-
content/uploads/GRS-Mobile-Top-up_Wireless-Market-Statistics-2015.pdf)
~~~
ianai
I think the "post paid" connection leading to a 2 year lifecycle is suspect.
There's a big aftermarket repair industry in the US. Many people have a
singular cell phone as their internet device - and it's often ancient by IT
standards. 2+ year old hardware needs to be getting software/OS upgrades.
~~~
majewsky
I favor the proposal of requiring a prominent "Best Before" date for new
devices, indicating how long the manufacturer will guarantee the availability
of security updates.
~~~
ianai
That's an SLA, and we should all be getting them.
------
untog
If nothing else comes out of this, I hope we end up with an Android OS that
works better than the current one.
I've been running Android since the Nexus One so I'm no newbie to the
platform, but the ease with which iOS manages to get all UI interactions at
~unnoticeable FPS and outstanding battery life is staggering when you're used
to Android. It feels like some really fundamental choices were made badly on
the platform that make it incredibly inconsistent and unreliable. A fresh
start would be fantastic.
~~~
Inconel
As a fellow Nexus user (I've owned the Nexus One, 5, and currently use a 6P),
how much of this is due to the OS versus hardware? Will Google ever be able to
achieve Apple-level battery life or overall UI smoothness, not to mention
update support, without having their own custom SoC?
I was very happy with the 5, even with the intermittent lags, especially
considering its price at release. I suppose I'm not a very heavy phone user,
and I never play mobile games, but I've been very happy with the 6P on Android
6.0-7.1. Battery life could definitely be better, and it does get fairly warm
at times, but overall it's been a very good experience for me considering the
Snapdragon 810 it's using is generally poorly regarded.
~~~
scott_karana
Apple has had fluid UIs since the start, despite off-the-shelf, low-resource
Samsung SoCs. They only started making custom ones with the 4S, as I recall.
~~~
fixermark
Yep. Out of the starting gate, Apple forced tight constraints on background
and multi-threaded processing - so tight that the first versions of the iPhone
OS couldn't support some types of application that the Android OSs could,
unless Apple wrote the program and could take advantage of the private APIs in
the OS. But the advantage to that B&D approach was responsiveness and battery
life, relative to an Android OS where any developer could write an app that
would spawn an ill-behaved background thread and suck your battery.
------
mncharity
> I don’t see the average garbage-collected language using a virtual machine
> allowing for a target higher than 60fps realistically.
But... " _average_ garbage-collected language using a virtual machine" doesn't
describe _any_ of C/C++, Dart, Go, Java, Python, or Rust. Nor Javascript.
I get greater than 60 fps with my existing Vive three.js WebVR-ish
electron/chromium linux stack. Even on an old laptop with integrated graphics
(for very simple scenes). Recent chromium claims 90 fps WebVR, and I've no
reason to doubt it. So 60 fps "up to 120fps" seems completely plausible, even
on mobile.
~~~
saghm
> But... "average garbage-collected language using a virtual machine" doesn't
> describe any of C/C++, Dart, Go, Java, Python, or Rust.
I'm curious; what would be an example of something you would describe as an
"average garbage collected language using a virtual machine"? Java would
certainly be the first language I'd think of for that description.
~~~
munificent
> I'm curious; what would be an example of something you would describe as an
> "average garbage collected language using a virtual machine"?
Ruby 1.8, Lua, or CPython.
~~~
saghm
Python was included in the list of things considered not to be in this
category; I probably agree with you on that one, but the idea behind my
question was that the languages listed as not "average garbage-collected
language using a virtual machine" included several that I'd include in that
category.
What do you think makes Ruby and Lua more "average" than Java?
------
grizzles
Since Fuchsia is a new kernel, that means it will probably only support Google
hardware.
The status quo right now among Android hardware vendors is to violate the GPL,
and they have faced few if any repercussions for doing so. I wonder if Fuchsia
is sort of viewed as the way forward to addressing that.
Anyone care to speculate why there isn't a community version of Chromium OS?
I'd donate to it for sure. It sounds like getting Android apps working on it
would be pretty easy:
[https://groups.google.com/a/chromium.org/forum/?hl=en#!topic...](https://groups.google.com/a/chromium.org/forum/?hl=en#!topic/chromium-
os-discuss/OfBln-hl7ug)
~~~
bitmapbrother
>The status quo right now among android hardware vendors is to violate the GPL
No, it's not the status quo. The major OEMs do release their code. Yes, there
are some Chinese OEM violators, but that's typical of China.
~~~
grizzles
You can release code and still violate the GPL in other ways. For example,
there are binary blobs out there and the GPL is pretty unequivocal on this
point: "The source code for a work means the preferred form of the work for
making modifications to it."
~~~
simonh
It depends whether the binary blob is a derived work or not.
[http://yarchive.net/comp/linux/gpl_modules.html](http://yarchive.net/comp/linux/gpl_modules.html)
------
bitL
Seems like the Free Software that propelled early Internet pioneers served its
purpose and those companies are turning their backs on it - first Apple with
GCC->LLVM, now Google with Linux->Fuchsia :( I am getting afraid of another
dark age on the horizon... I guess it's going to be inevitable, as 90% of SW
developers will find themselves redundant when inferring AI capable of
composing code blocks and listening to/reading speech/specifications arrives
in the upcoming decade, making creation of typical web/mobile apps trivial.
~~~
bashtoni
I think Google is probably the most Free Software friendly of the new big
three (Amazon, Google, Microsoft). They haven't disappointed with Fuchsia,
which appears to be entirely copyleft:
[https://fuchsia.googlesource.com/magenta/+/master/LICENSE](https://fuchsia.googlesource.com/magenta/+/master/LICENSE)
[https://fuchsia.googlesource.com/fonts/+/master/LICENSE](https://fuchsia.googlesource.com/fonts/+/master/LICENSE)
~~~
Asooka
That's the BSD license though, which doesn't force companies to respect your
freedoms. Parent was talking exactly about the GPL being dropped in favour of
licenses that allow vendor lock-in. It means a philosophical departure from
user-first towards corporation-first, and the Free Software world the FSF
envisioned getting trampled.
The zeitgeist is moving towards conservatism in general, so it doesn't
surprise me, but it's still sad.
------
techenthusiast
Article author here. Posted this in the notes, but possibly too buried:
For anyone interested, I intend to write quite often about consumer technology
on this blog. Topics will include hardware, software, design, and more. You
can follow via RSS or Twitter, and possibly through other platforms soon.
Sorry for the self promotion!
Thanks for reading. Please do send any corrections or explanations.
~~~
jm_l
Hiroshi Lockheimer has publicly stated several times that there is no merger
of Chrome OS and Android.
[https://chromeunboxed.com/some-andromeda-perspective-
hiroshi...](https://chromeunboxed.com/some-andromeda-perspective-hiroshi-
lockheimer-emphatic-on-separate-oss-moving-forward/)
I think you alluded to this, "cue endless debates over the semantics of that,
and what it all entails," but it might be worthwhile to add the official
statement.
~~~
ktta
But there are frequent commits to multiple repositories in the Fuchsia code
base[1]. I don't really see where Google is going with this if it's meant to
replace neither Chrome OS nor Android.
Maybe a long term project? I think Google is in a position where they can
write a great OS from scratch, learning from the mistakes of others, and it
has a chance of becoming the greatest OS that ever was.
With the talent of its engineers, they can bring new ideas that can be better
implemented from scratch on a new OS. They already have a bunch of languages,
web frameworks, and so many more technologies from Google that can be well
integrated in this.
And it looks like the project is mostly BSD licensed, which is great! I'm
excited for just that alone.
[1]:[https://github.com/fuchsia-mirror](https://github.com/fuchsia-mirror)
------
thinkloop
Why can't the pure web replace apps and programs? All the pieces are almost
there: hardware acceleration, service workers, notifications, responsive
design...
I currently "add to home screen" for most things. I edit my images online, and
develop code using cloud9 ide, etc. There are few things I need apps/programs
for right now, and that's improving day by day.
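As a sketch of one of those pieces: registering a service worker, so a page
keeps working offline, takes only a few lines of TypeScript (the `/sw.js`
path is hypothetical):

    // Register a service worker, the piece that enables offline use and
    // background caching for "add to home screen" web apps.
    if ("serviceWorker" in navigator) {
      navigator.serviceWorker
        .register("/sw.js")
        .then(reg => console.log("service worker ready, scope:", reg.scope))
        .catch(err => console.error("registration failed:", err));
    }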
The iPhone is dropping heavily in worldwide market share, but it still has a
lot of the wealthy users. There is a non-zero chance they get niched out of
prominence by Android (aka every other manufacturer in the world), at which
point network effects start encouraging Android-first or Android-only
development. There might be a point where Apple needs to double down on the
web, and/or maybe kill off apps, like they did Flash, to still have the latest
"apps".
~~~
Sanddancer
I take photos miles from where there's cell signal. I write code on the bus
while heading to doctors appointments. The web is about as far from a panacea
as you can get. It's slow, it's bloated, falls apart when you don't have a
connection, useful applications die when the company dies. Were some of the
MIDI devices I use for music "web-based", they'd have probably become
doorstops decades ago. A web-based IDE would be horrible for trying to develop
code with an intermittent connection. The web is not a good time.
~~~
fauigerzigerk
The intermittency issues can be fixed but I agree that the dependency on web
app providers and their fickle business models is scary.
The way it works is to funnel all the profits into a few huge conglomerates
that benefit from exclusive access to all personal data and train users to
never depend on anything that isn't a core product of one of these
conglomerates.
Using their 80% margins they can afford to at least give us some time before
scrapping software that doesn't look like it's ever going to reach 4bn
consumers.
The result is stability. Until they all get toppled by the next technology
revolution. Years later, regulators will crack down hard on some of the side
issues of their former dominance and once again miss the currently relevant
issues :)
------
therealmarv
This blogpost has waaaay too many assumptions. Reading it, it seems easy to
rip out kernels, OS & software and stack them like layers of a cake on top of
a new OS. Even for Google this is crazy complicated. It will not be that easy.
For sure not... and I also see no clear strategy for WHY somebody should do
that. It's like baking the cake with too many ingredients. ;)
~~~
djsumdog
I can see the kernel thing happening. The licensing and breaking ABI alone
are among the biggest factors in not being able to have an easily upgradable
Android.
I only see this as a good thing if this ensures an easier upgrade path than in
Android; and if vendor ROMs can easily be replaced by a stock OS (like on
Windows).
~~~
therealmarv
I definitely cannot see the kernel thing happening. Ever thought of power
management and keeping the whole system fluid? These are not easy problems
that you solve in 1 or 2 years. It may only work for very specialized
hardware... and speaking of hardware: hardware driver support is also
something most other kernels suffer on in comparison to e.g. Linux.
~~~
Roy0
> Hardware driver support is also something most other Kernels suffer from in
> comparison to e.g. Linux.
So?
Google doesn't have to support all hardware, they can pick to support only the
hardware they want. That's what they already do with ChromeOS. Installing
ChromiumOS on unsupported hardware can have its issues. The reverse is true
too, installing not-ChromeOS Linux or another OS on Chromebook does not always
work well, although it's fine on some specific models.
Android is like that too, and in a much worse way than Chromebooks. We're not
talking about stellar Linux kernel support for all the custom ARM SoCs that are
out there. Manufacturers all write their own closed-source hardware support for
Android, and this is how Android ends up having issues with updating: whenever
Google updates the Linux kernel it breaks the ABI and all the support
manufacturers wrote for the previous version, and manufacturers do not want to
spend so much time on needless busywork such as keeping up with kernel API
churn that exists just to satisfy the dev team's sense of perfection.
------
Insanity
> IDEs written in Java are wildly slow…
My favourite IDE to use today is IntelliJ, and I prefer it over my experience
with Visual Studio (though to be fair, I did not use VS intensively in the
past 3-4 years).
I don't experience IntelliJ as "slow". It launches faster than VS did when I
used it, and once it is running I keep it open pretty much the entire work-
week without any issues.
~~~
pjmlp
Other than NetBeans and Eclipse being faster, and not turning my dual core
into airplane mode like Android Studio does, which forced me to enable laptop
mode on it.
~~~
Insanity
I don't really understand what you are trying to say, sorry :/
------
iainmerrick
"Fuchsia" and "magenta" are pretty gutsy names to choose, given how similar it
sound to Apple's vaporware "Pink" OS from the 90s (AKA Taligent, AKA Copland).
Somebody has a sense of humor!
It's really hard to tell if this is actually something that will ship, or yet
another Google boondoggle to be swiftly discarded (like the first attempt at
ChromeOS for tablets). Google under Larry Page built and discarded a lot of
stuff; I wonder if it's the same under Sundar Pichai.
[https://en.wikipedia.org/wiki/Taligent](https://en.wikipedia.org/wiki/Taligent)
~~~
JustSomeNobody
Sounds like a stretch, having to go all the way back to the '90s to get a
similar color code name.
~~~
iainmerrick
It was the first thing those unusual names made me think of. But I'm a long-
time Mac developer, so probably pink and purple colors as OS names won't have
the same connotations for other people.
------
camdenlock
This could be the first time Apple needs to truly worry about Google. The one
massive lead Apple still has over Google (and the other major players) is the
incredible OS they inherited back in 1997 and continue to extend and maintain
today.
Neither Android nor Windows nor Chrome OS nor your favorite Linux distro have
ever been able to truly compete with the NeXT legacy as it lives on in Apple.
Google is smart enough as a whole to see this, and so it's not surprising that
they're attempting to shore up their platform's competence in this particular
area. What IS surprising is that it has taken them this long.
Perhaps what's truly surprising is just how much mileage Apple has gotten out
of NeXT. It's astounding, and I know Apple realizes this, but I question
whether or not they know how to take the next step, whatever that may be. And
if Google manages to finally catch up...
~~~
jcranmer
> Neither Android nor Windows nor Chrome OS nor your favorite Linux distro
> have ever been able to truly compete with the NeXT legacy as it lives on in
> Apple.
I find this a funny statement. Apple has not seen runaway success in terms of
market share, not on desktop platforms (where the top OSes are various
versions of Windows), not on mobile platforms (where it is a distant second to
Android in the worldwide market), not on server or supercomputer platforms
(where it's effectively nonexistent).
Nor is it influential in terms of operating system paradigms. The only thing I
can see people citing as a Darwin innovation is libdispatch. Solaris, for
example, introduced ZFS and DTrace, as well as adopting containers well before
most other OSes did (although FreeBSD is I think the first OS to create the
concept with BSD jails)--note that Darwin still lacks an analogue.
~~~
glasz
it's not about market share. it's about profit share. android/ios may be 80/20
on market. but they are 20/80 on profit.
market share won't feed nobody. that's all apple needs to care about. just
look at their market cap and p/e ratio.
------
wapz
I'm a minority I know but I don't like material design because it's terrible
at "scaling." It looks great, it's beautiful, but you lose too much damn
functionality. When I had to redo apps to material design we had to completely
remove multiple buttons due to them not fitting material design standards. I
really hope they have some way to alleviate this problem without using 50px
icons for all the extra buttons.
~~~
manmal
Why not bend the rules a bit before omitting vital components?
~~~
wapz
They weren't _vital_ components but useful for the user. We moved most of the
"excess" buttons to the top bar and overflow menu but still had to remove a
button here and there completely (we still had the functionality in a
different part of the app it was just more tedious to use from our testing).
------
Nypro
So Google is going with a DartVM on this one. Dart is cool and all, but why
the DartVM? It's the same restrictive model we have with Android (DalvikVM),
where you can only develop in languages that can compile down to Java bytecode.
In this case, however, we would be using languages that can transpile to Dart
source instead! Why not a JavaScript engine? With the current movement toward
WebAssembly, I see a lot of potential use cases. The biggest point being the
ability to code in any language that compiles to wasm. The engine could be
exposed to communicate with the OS directly or something. If they are going to
consider V8 alongside the DartVM, then that would be cool. I truly hope they
don't repeat old mistakes.
~~~
thwd
Dart can compile to JS, so that edge is covered. Dart bytecode is arguably an
easier target for other language compilers than, say, assembly. The DartVM's
byte code is fully specified, there is some adoption, it's openly accessible,
and there's a bold, production-ready reference implementation. For wasm, only
half of that is true.
~~~
sp332
Compiling Dart to JS is a very different problem than compiling JS or other
languages to Dart.
~~~
thwd
I understand what you mean and agree. But then, you don't need to compile JS
into Dart. Rather, JS into DartVM byte code.
------
endorphone
Conjecture is fun, but the linked piece takes some enormous liberties,
crossing massive chasms effortlessly. Not only is Fuchsia _not_ Andromeda (a
project), the needs of IoT are massively different from the needs of Android.
And the net investment in Android is absolutely colossal; some new APIs or a
microkernel does not a replacement make.
------
sametmax
Google could have taken Firefox and improved it to make it better, but they
created something new.
Now, instead of improving the Linux stack and the GNU stack (the kernel,
Wayland, the buses, the drivers), they rewrite everything.
They put millions into this. Imagine what could have been done with that money
on existing software.
They say they are good citizens in the FOSS world, but eventually they just use
the label to promote their products. They don't want free software; they want
their software, which they control, and which they let you freely work on.
~~~
lossolo
Isn't Google paying Mozilla for Google searches made from the search bar in
Firefox? Isn't that the main money flow for Mozilla? The more people use
Chrome, the more money Google saves. It's all about the data; making Firefox
better didn't benefit Google as much as creating a new browser. Now they do not
have to pay other companies so much for all their users' searches, and they
have much more data that they can use internally for other products.
~~~
teddyh
No, that ended in 2014. Google no longer gives Mozilla anything.
------
mcguire
" _The pitch will clearly be that developers can write a Flutter app once and
have it run on Andromeda, Android, and iOS with minimal extra work, in
theory._ "
How's that going to work? iOS, specifically? Is Dart a supported language?
~~~
afsina
This may answer your question:
[https://flutter.io/faq/#how-does-flutter-run-my-code-on-
ios](https://flutter.io/faq/#how-does-flutter-run-my-code-on-ios)
and
[https://flutter.io/faq/#can-i-interop-with-my-mobile-
platfor...](https://flutter.io/faq/#can-i-interop-with-my-mobile-platforms-
default-programming-language)
~~~
mcguire
From the first link:
" _The engine’s C /C++ code is compiled with LLVM, and any Dart code is AOT-
compiled into native code. The app runs using the native instruction set (no
interpreter is involved)._"
Thanks!
------
brianon99
Google is just afraid of GPL I think.
~~~
ekianjo
So their strategy is to go full-blown closed source?
~~~
sreenadh
Open sourcing the product would help the competition catch up quickly; all
they would have to do is take Google's product and change its look. The other
half is infrastructure, which companies like Microsoft, Facebook, Amazon, and
Alibaba all have. Plus, services like AWS will help future versions of Dropbox
& Netflix.
A good example of that is Visual Studio Code. I am sure someone at GitHub
(Atom's parent) is pissed.
~~~
jryan49
Wikipedia says it's not a fork of Atom but based on Electron. Does that make a
difference?
------
dep_b
So in the near future billions of devices will no longer be running Linux?
That would be quite a blow to Linux's chances of dominating the operating
systems used by end users. Or will they simply fork it and strip it down until
only the parts they really like remain?
~~~
atoponce
This is also a concern of mine. What will this mean for rooting devices? Will
it still be root, or will it be "root" as in an iOS jailbreak?
------
yen223
Rust as a first-class language?
~~~
mastax
I believe the article over-sells Fuchsia's use of Rust. Raph Levien wrote some
bindings to the OS runtime, and he does work at Google, but his Rust work is
not official.
(or that's the story as I remember it)
~~~
sametmax
Which is a shame; they are missing an opportunity to ditch C/C++ in favor of a
safer language and set a precedent in OS history.
Imagine how much easier contributing would be if you could write the OS parts
in fewer lines, guaranteed not to introduce most of the security and
concurrency bugs we know about.
------
Flenser
ANDROid + chROME + DArt = ANDROMEDA?
------
skybrian
re: "Flutter was [...]"
A bit weird to use the past tense here since it's not reached 1.0 yet. You can
try it out today (tech preview) to create apps in Dart that run on Android and
iOS:
[https://flutter.io/](https://flutter.io/)
(Googler, not on the Flutter team itself, but working on related developer
tools.)
~~~
techenthusiast
Fair point! Just fixed that, thanks. I had only meant that Flutter was not
originally intended for Andromeda, as far as I can tell :)
------
bsaul
Trying to see the other side of the coin: what economic reason is there for
this project?
A company the size of Google, with all its internal politics, doesn't work like
a startup. Starting a third operating system project and hoping for it to
replace two major ones means convincing people inside the company to lose part
of their influence. Now, that might happen if Chrome or Android were failing,
but they're clearly not.
------
techenthusiast
I updated the article with the following clarification at the top:
I use Andromeda equivalently with Fuchsia in this article. Andromeda could
refer to combining Android and Chrome OS in general, but it's all the same OS
for phones, laptops, etc. - Fuchsia. I make no claims about what the final
marketing names will be. Andromeda could simply be the first version of
Fuchsia, and conveniently starts with "A." Google could also market the PC OS
with a different name than for the mobile OS, or any number of alternatives. I
have no idea. We'll see.
------
mankash666
I hope the userland is POSIX/Linux compliant. There's a TON of useful software
reliant on this compliance that will go to waste if it isn't compliant out-of-
the-box.
~~~
pjmlp
It doesn't seem to affect ChromeOS, iOS or Android.
------
shams93
As a sound/music app person, the inclusion of ASIO for audio is exciting.
Google's new OS should be on par with iOS for sound, with ASIO audio drivers at
the core.
------
lenkite
AFAIK the Flutter UI framework is a React-like framework written in Dart (with
C++ as OS glue), including the UI -> graphics rendering layer. It builds upon
Skia and Blink. I am not sure how that will allow compatibility with other
languages. The only language for UI apps looks to be Dart. Which isn't bad -
it's a pretty well-designed language - but I don't see how apps can be written
in a wide variety of languages as the author suggests.
------
akmittal
>the main UI API is based on, yes, Dart
Won't Dart's single-threaded nature make it hard to take advantage of
multi-core processors? Or are they embracing web workers?
~~~
bitmapbrother
Dart, in the context of Fuchsia, isn't really a web based language. So yes,
it'll take advantage of multi-core processors.
------
bitmapbrother
The author calls it Andromeda OS, but is this really the Andromeda OS we've
been hearing about? I'm not so sure about that. What we do know right now is
that the OS is currently code named Fuchsia.
Fuchsia repository:
[https://fuchsia.googlesource.com/?format=HTML](https://fuchsia.googlesource.com/?format=HTML)
~~~
techenthusiast
There has been other reporting about this going back to last fall. I don't
think Fuchsia is the marketing name.
------
antoncohen
Link to the source code:
[https://fuchsia.googlesource.com/](https://fuchsia.googlesource.com/)
------
MichaelMoser123
The article says it's a microkernel; I wonder if it will be a more secure
general-purpose OS. Windows NT started as a microkernel, but they changed that
with NT 4; let's see if this one will be different. I also wonder about driver
support and battery consumption. Good luck to Google.
------
sjtgraham
> The pitch will clearly be that developers can write a Flutter app once and
> have it run on Andromeda, Android, and iOS with minimal extra work, in
> theory.
This is worrying for Apple. I can see the following playing out
- Apple continues releasing machines like the TB MBP, much to exasperated
developers' dismay.
- Other x86 laptops' industrial design and build quality continue to improve.
- Fuchsia/Andromeda itself becomes a compelling development environment.
- Developers begin switching away from Mac OS to Fuchsia, Linux and Windows.
- Google delivers on the promise of a WORA runtime, and the biggest objective
reason not to abandon Mac OS, i.e. writing apps for iOS, disappears.
- Apps start to look the same on iOS and Android. iOS becomes less
compelling.
- iOS device sales begin to hurt.
Granted, App Store submission requires Mac OS (Application Loader), and the
license agreement requires that you only use Apple software to submit apps to
the App Store and not write your own, but it seems flimsy to rely on that.
------
conradev
Here is a link to the documentation:
[https://fuchsia.googlesource.com/magenta/+/master/docs](https://fuchsia.googlesource.com/magenta/+/master/docs)
------
avmich
I didn't see any explanation in the article of why those decisions were taken
and not others. On the surface it feels like this is an OS insufficiently
different from others to justify switching to it.
------
pier25
We definitely need a universal OS for all our devices and I really believe
Google is in a great position to get us there.
It would really surprise me if Apple got there first. Tim lacks vision and
will keep on milking iOS even if the iPad Pro is a failure as a laptop
replacement.
Windows is still king in the desktop space, at least as far as user base goes,
but it's terrible on tablets and phones. MS has all the tech in place with
UWP, but it's still pretty far behind in the race in terms of simplicity and
usability.
Chrome OS ticks all the right boxes, and is experiencing huge growth, but
it's not universal. If Andromeda is real, and it's able to become a universal
OS that merges Chrome OS and Android it might be the best thing since sliced
bread.
~~~
thewhitetulip
Yes, we do. I want my phone and my laptop to be in total sync; I want to be
able to write code on my mobile which I can just continue on my laptop without
any hindrance. Currently I have a Mac and an Android phone. I do have Go and
Python installed on my mobile, but it isn't that great to code on, so I have to
host the repo on an internal instance of Gogs to get the code synced up, and I
still have to manually push the code around.
All hail Universal OS!!
~~~
vlunkr
You could always develop remotely: use your phone to ssh into a more powerful
machine. Use tmux or screen and pick up right where you were on a laptop or
desktop. This is far more compelling IMHO.
~~~
mcguire
Tmux or screen? Wouldn't it be nice if there were a graphics-over-the-network
system?
~~~
lliamander
I remember the glee of the first time I got an X Window application to run
over the network. I was so confused, though, because the "X server" is the
software you run on your client machine.
~~~
mcguire
You and the _Unix Haters' Handbook_ authors.
There is a small terminology issue here: a "server" is a program that offers
services to remote "client" programs. The clients make requests and the server
responds to them. A client program will make a request like "allocate me a
chunk of the screen and put these here bits in it", or "let me know about any
of these events that happen". The server manages the screen and notifies the
clients about things they're interested in.
_IT MAKES PERFECT SENSE, DAMMIT!_
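To make the roles concrete, here's a minimal sketch of an X client in C using
Xlib (purely illustrative; error handling is mostly omitted). The point is that
this program - the client - can run on a big build box, while the server that
owns the screen runs on the machine in front of you:

    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void) {
        /* Connect to the X server. NULL means "use $DISPLAY"; something
           like "workstation:0" would talk to a server on another machine. */
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot connect to X server\n");
            return 1;
        }
        int screen = DefaultScreen(dpy);

        /* "Allocate me a chunk of the screen": ask the server for a window. */
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                         10, 10, 200, 100, 1,
                                         BlackPixel(dpy, screen),
                                         WhitePixel(dpy, screen));

        /* "Let me know about these events": subscribe, then show the window. */
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        /* The server streams events back to us, its client. */
        XEvent ev;
        do {
            XNextEvent(dpy, &ev);
        } while (ev.type != KeyPress);

        XCloseDisplay(dpy);
        return 0;
    }

Build with "cc demo.c -lX11" and point DISPLAY at your workstation: the window
appears in front of you even though the process runs somewhere else, which is
exactly why the thing managing your local screen is the "server".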
~~~
lliamander
This made me laugh (in a good way).
I agree, it actually does make total sense - but that doesn't mean I won't get
confused :).
My only prior exposure to "GUIs over the network" was web applications, where
the roles are essentially reversed. That is, the part responsible for
accepting user input and rendering the UI is the client (the browser), and the
part that performs the application logic is the server.
I naively assumed that X would work the same way, but it wasn't too hard to
unlearn that misconception.
------
gillianlish
Guys, I hate to tell you this, but it's Feb 15 here in New Zealand, and Google
has cancelled Andromeda.
~~~
solidsnack9000
[http://platypusplatypus.com/news/google-andromeda-isnt-
happe...](http://platypusplatypus.com/news/google-andromeda-isnt-happening/)
------
ungzd
I think it's intended for entertainment and content consumption, just like
Android and Chrome OS. And Apple is trying the same with merging iOS and
desktop. How long will it take until all computers are set-top boxes where you
can only Netflix and chill, and if you want, for example, to draw something,
you have to buy a Professional Grade Computer for $50,000?
------
am185
Wow! It supports Go. And since it has GLSL, this will have a nice UI.
[https://fuchsia-
review.googlesource.com/#/q/project:third_pa...](https://fuchsia-
review.googlesource.com/#/q/project:third_party/go)
------
kzrdude
Does anyone have a more detailed explanation of the component called
“modular”?
------
xtat
Honestly I'd rather keep Linux and ditch the JVM.
------
BrailleHunting
NIH syndrome, plus large-organization people looking for job security, plus
rough, bloated FLOSS full of maintenance and security-vuln hell, equals "the
Emperor's new clothes."
------
HiroP55
The comment about Java-based IDEs being slow is not entirely objective and
fact-based. I'd say it's more of an emotional argument.
------
em3rgent0rdr
[https://imgs.xkcd.com/comics/standards.png](https://imgs.xkcd.com/comics/standards.png)
------
diebir
Quote: "I am not a programmer, so if anything stated above is incorrect,
please, please send me corrections. I would also greatly appreciate any
clarifying comments, which were one of my primary motivations for writing this
piece." Essentially a bunch of nonsense, in other words.
------
chatman
Microsoft is opening up more and more, and Google is closing down more and
more.
------
mtgx
It would be a real shame if Google wasted this once-in-a-decade, or perhaps
once-in-multiple-decades, opportunity to have an OS written in a language
other than C++.
Also, it would be mind-boggling if they didn't actually fix the update problem
this time, and if it wasn't a top 3 priority for the new OS.
~~~
frozenport
It's written in C.
There is little beyond syntax that a different language can offer because a
modern OS cannot afford features like garbage collection. Indeed, this was one
of the research aims of MS's Singularity project.
~~~
sametmax
They could have written it in Rust. No garbage collection, more security
guarantees. Easier to contribute to the code properly.
~~~
frozenport
Rust performed 3x slower and hacking around the language made it somewhat of a
mess [1]. Much like Singularity, this is hardly a success story. Although
Singularity was interesting from a research perspective nobody doubted that an
OS could be written in Rust.
[https://scialex.github.io/reenix.pdf](https://scialex.github.io/reenix.pdf)
~~~
steveklabnik
That paper is very old, from before Rust 1.0. There was also a lot of
discussion about ways that they could have used Rust better at the time, IIRC.
Today, there is no reason Rust should ever be 3x slower, especially in an
OSdev context, where you currently have to use nightly.
Andrew Fluegelman: PC-Talk and Beyond (1985) - ingve
https://medium.com/@harrymccracken/the-1985-andrew-fluegelman-interview-5791470819db
======
gjvc
I liked this quote from the interview:
"The amount of code that’s doing the work is tiny compared to the amount of
code that’s just making life pleasant for you." \-- Andrew Fluegelman
DigitalOcean - dayanruben
https://www.digitalocean.com/company/blog/ready-set-hacktoberfest/
======
runamok
The title should be "Ready, Set, Hacktoberfest!".
The Most Common Error in Media Coverage of the Google Memo - mimbs
https://www.theatlantic.com/politics/archive/2017/08/the-most-common-error-in-coverage-of-the-google-memo/536181/?single_page=true
======
Chardok
Regardless on your opinions of the memo, this article nails it right on the
head; Gizmodo and other major news outlets handled this _very_ irresponsibly,
posting their version that had no citations, and leading the reader, even at
the very beginning, into forming the opinion that this was simply a guy being
"anti-diversity".
Whatever your thoughts on the subject are, it needs to be pointed out that
this type of journalism is absolutely _not_ neutral (even though they will
swear up and down that they are) and should be, at the very least, condemned
for doing so. This is and will be an increasingly difficult problem,
especially when people just read a headline and a summary.
~~~
wcummings
This article doesn't say anything about citations, unless I'm missing
something. I've read the memo, "citations" (some of them hardly qualify) and
all. "Anti-diversity" might be a little hyperbolic, but it's hard to argue it
is anything else _in good faith_.
The author feels comfortable making the claim _women are biologically more
neurotic_, in a _professional setting_, no less. His "citation" for this
fact is _Wikipedia_. This is unimaginative sexist bullshit that everyone has
heard before. That people feel the impulse to defend this is disgusting.
Whatever innate differences there are between men and women are small and
nowhere near as big a factor in the gender-ratio as say, arrogant weirdos
sending credos to their entire company declaring women unfit to work in their
industry.
~~~
mpweiher
> The author feels comfortable making the claim women are biologically more
> neurotic,
Nope. "Neuroticism" is a technical term, a big-five personality trait.
Everyone has it to some degree and having less is not necessarily better, just
like all the other traits.
What you are getting all riled about is "neurotic", which is related to a
psychiatric condition (although the term is apparently no longer used – who
knew?)
> declaring women unfit to work
Yeah, he didn't do that.
Please calm down and read the actual text. If things seem weird or rile-
worthy, maybe ask first. Or look.
I did a little writeup, [http://blog.metaobject.com/2017/08/the-science-
behind-manife...](http://blog.metaobject.com/2017/08/the-science-behind-
manifesto.html) there are probably others that are better.
EDIT: language
~~~
wcummings
I know what neuroticism is.
~~~
mpweiher
Fantastic!
So why did you then use "neurotic" and get all riled up about it when the
document says "neuroticism", which is (a) nothing to get riled up about and
(b) a simple scientific fact (as best we know)?
------
agentgt
What pains me is before reading this article I was slightly biased because of
previous articles.
Actually it doesn't pain me... it really pisses me off that so many
journalists are fooling people... particularly me.
When I read the bloomberg article (which I submitted to HN and now I want to
just bang my head on the table for doing it) I was actually slightly siding
with Google. Even though I was constantly telling myself "lets see the memo
before judging" I could feel myself making a biased assumption.
I'm so annoyed with myself.
~~~
dvfjsdhgfv
I feel the same. When I first read the censored Gizmodo version, I thought to
myself, "Whoah, this guy is making up too many assumptions without any
references!" Later, when I found the original version with references, I felt
a bit stupid for judging him.
~~~
mpweiher
For me it was the other way around. I was aware of the references, so I just
nodded and moved on, not realizing how incendiary some of these things sound
when you don't have the context.
For example, "neuroticism" being just a technical term for a normally varying
personality trait, not "mental illness".
~~~
novembermike
Yeah, I was surprised to see that Neuroticism was a technical term for a
phenomenon with a well-studied gender gap. If you don't know that, it sounds
pretty mean.
------
alexandercrohde
This is an incredibly well-written article. I wish I had the emotional
distance and mastery of English to express myself with such grace.
Unfortunately for me (and everyone) it takes me a lot longer to find the exact
words for my frustrations in a situation like this. So I end up wanting to say
inflammatory accusations like "PC-group-think witch-hunt," which captures my
anger but doesn't really convince anybody on the other side (but rather
escalates the tension).
The author cleverly brings both sides together by picking a starting point we
all agree on ("Accuracy in journalism matters") and dissecting how that value was
compromised [in this particular case] in order to promote another value:
"Diversity matters."
Paul Graham describes such a technique in his seminal essay.
[[http://www.paulgraham.com/say.html](http://www.paulgraham.com/say.html)
One way to do this is to ratchet the debate up one level of abstraction. If
you argue against censorship in general, you can avoid being accused of
whatever heresy is contained in the book or film that someone is trying to
censor. You can attack labels with meta-labels: labels that refer to the use
of labels to prevent discussion. The spread of the term "political
correctness" meant the beginning of the end of political correctness, because
it enabled one to attack the phenomenon as a whole without being accused of
any of the specific heresies it sought to suppress.
]
~~~
agarden
Sometimes I think Conor Friedersdorf is my hero. He regularly turns out
articles on prickly topics with a cool, compassionate tone.
------
meri_dian
>"Donald Trump campaigned on the promise of more jobs for working-class
Americans. In service of that end, he has proposed canceling free-trade
agreements, building a wall to keep out immigrants, and eliminating lots of
environmental regulations. Critics who avow that they favor more jobs for the
working class, but oppose achieving more jobs through those specific means,
are not described as “anti-job,” especially when they suggest specific
alternatives for job-creation. Even if their alternatives would result in
fewer jobs than the Trump administration’s plans, that still wouldn’t make a
writeup of their proposal “an anti-job memo.”"
Great point. While reporting on this memo isn't 'fake news', it is an example
of reactionary, knee jerk, click bait journalism, which is a pernicious
problem that stifles nuanced debate, and is probably doing more damage to our
society than literal 'fake news'.
~~~
weberc2
Yeah, when I read through the HN comments from the Bloomberg post, I couldn't
believe how many people spoke about this guy as though he was an actual Nazi.
I can't imagine anyone honestly reading that article and coming to that
conclusion. It's almost like once someone signals that this guy is The Bad
Guy, reason and individual thought go out the window. Of course, I imagine
those folks would disagree with my characterization (and probably make all
manner of insinuations about my character or allegiances), which is admittedly
not very charitable. That said, there are only so many times you can watch the
same pattern of events unfold...
EDIT: In case people don't know what I mean by "pattern of events", I'm
thinking of cases like the Christakis/Yale fiasco where people make
reasonable, respectable, constructive critiques of progressive values and are
met with popular outrage.
------
dvfjsdhgfv
Seriously, it sounds like the author of this article is the first journalist
who actually read the memo in full and reflected on its contents.
Unfortunately, it's a few days too late - the rest have already spread wrong
summaries, the misinformation has taken hold, and the author of the memo has
been fired.
~~~
kinkrtyavimoodh
I think you are giving all the click-bait writers far too much benefit of the
doubt.
what they are doing.
They know exactly what they are doing, and what they are doing is creating
outrage and profiting from it in this eyeball-click-ad driven journalism
economy.
Very few are going to have the wherewithal to forego hundreds or thousands of
dollars in ad revenues for the sake of a slightly more gently or fairly worded
headline or article. The incentives just do not match up.
~~~
strictnein
Yep, and sadly even major news orgs like CNN are doing this now. Case in
point:
"Google CEO cuts vacation short to address controversial memo that argued
women aren't biologically fit for tech jobs "
[https://twitter.com/CNN/status/894904419766108161](https://twitter.com/CNN/status/894904419766108161)
"A Google engineer argued that women aren't biologically fit for tech jobs."
[https://twitter.com/CNN/status/894951392779141120](https://twitter.com/CNN/status/894951392779141120)
~~~
Danihan
This is intentional. It's intended to capture attention from people who will
feel a need to argue that the headline misrepresents the content of the memo
(which is does).
Then CNN converts that emotionalism and clicks and comments into ad dollars.
It's exactly why the media LOVES any controversial "wedge" issues. They
polarize people, and polarized people argue incessantly. And people who are
arguing == views and ad dollars.
~~~
tajen
Or is it intentional by the leaders of CNN to try to side with Google on that
case?
------
matchu
It's important to distinguish between what the memo's author says, and what
effect his words actually have. It _is_ an anti-diversity memo, even if it
isn't intended as one.
The author makes shaky statements about gender, reinforcing sexist
stereotypes. The author applies rationalist disclaimers, which enables
already-sexist readers to feel that their sexism is rational. And, most
distressingly, the author asserts that Google made a mistake hiring many of
the women who work there. Actively making your minority coworkers feel
unwelcome is an anti-diversity behavior, and it was an obvious and predictable
consequence of how he chose to communicate.
I don't claim to know the author's intent, or how he truly feels about the
women he works with. But, regardless of whether he's actually opposed to
diversity, we judge words by their consequences. These words are thoroughly
anti-diversity in consequence, and judging them in a vacuum is dangerously
naive.
~~~
Danihan
Define sexism...
Is believing men and women on average have different hormone levels and
generally speaking, this leads to different behaviors and proclivities,
sexist?
Is admitting there is _any_ difference between the two sexes, sexist?
Is using different pronouns for men and women sexist?
I honestly don't know where someone draws the line who finds this memo
"sexist."
~~~
matchu
I haven't decided yet whether I think the memo is sexist. But I'm confident
that, because of how it's written, sexist people who read it will feel
validated in their sexism.
It uses the same core argument as sexism: women are less suited to certain
tasks, perhaps biologically. And it reaches the same conclusion: we should
roll back our pro-diversity and pro-empathy programs. A sexist person who
reads this will therefore feel that it supports their views, and, because the
argument seems rationalist, they'll conclude that their poor treatment of
women is rationalist. That might not be the intent of the document, but it
_is_ a predictable outcome.
Words that validate sexist behavior, intentionally or unintentionally,
contribute to the problem. Regardless of the merit of the underlying idea, or
the valuable conversations it inspired, it's important to remember that the
memo itself did harm. It's appropriate that some people are focusing on that.
~~~
slavak
This idea of avoiding saying something because of how horrible people might
choose to interpret it is something I find, frankly, terrifying. This is going
beyond just censorship and going into the realm of trying to censor reality.
Where do you draw the line on something like this? Are we allowed to publish
statistics that show black people are proportionally more involved in crimes,
or is this taboo because a white supremacist might use it to claim blacks are
inherently criminal? What if you write something apparently neutral but some
terrible person somehow finds a way to twist it to their ends? Do we get to
condemn you ex-post-facto over this?
~~~
matchu
I'm not saying _don 't_ have these conversations. Rather, have them carefully,
and choose your words with the consequences in mind. There are many good and
thoughtful ways to talk about potential issues with Google's gender diversity
programs, but instead this memo made some _especially_ bad choices.
For one thing, the memo focuses on needlessly contentious issues, instead of
sticking to actionable arguments. It's valid to say that decreasing stress in
engineering and leadership positions might attract more women, because modern
women tend that value that more. But framing it as a _biological_ issue is
hard to prove, and doesn't help support his logistical point. It _only_ has
the consequence of hurting people.
The memo also presumes that Google's full-time diversity experts haven't even
thought of his concerns. He asserts that seeking out women necessarily lowers
the hiring bar for them, instead of asking "How are we mitigating the risk
that our pro-diversity push might _itself_ introduce bias into our ideally
gender-agnostic perf evaluations?" That's a valid question, and I'm sure
Google's diversity team has answers, and I'm sure that some people wouldn't be
satisfied with those answers. But jumping to the conclusion that Google's
women must be less qualified than the men, just because _he_ can't think of a
way to mitigate bias in the hiring pipeline, is self-centered and
disrespectful.
I'm very much in favor of a world where it's equally okay to express all
ideas! But that doesn't mean we should be equally okay with all modes of
expression. No matter which side we're on, we need to think first, then speak.
Given the meta-thesis of the memo (especially the "prioritize intent"
section), I'm not convinced that the author took much time to consider needs
beyond his own.
~~~
slavak
> It's valid to say that decreasing stress in engineering and leadership
> positions might attract more women, because modern women tend to value
> that more. But framing it as a biological issue is hard to prove, and
> doesn't help support his logistical point. It only has the consequence of
> hurting people.
What it sounds like you're saying is that claiming women are on average more
sensitive to stress, based on extensive scientific research which implies a
strong biological basis, is contentious and hurtful. But then for some reason
saying modern women tend to put more value on a stress-free environment, based
on nothing but an unsupported assertion, is somehow better?
I don't have a crystal ball, but I suspect you're being naive and that the
outrage would have been much the same no matter how he'd chosen to frame this
statement. The very assertion that men and women have some innate differences
that might be worth exploring seems to be tantamount to blasphemy --
particularly when coming from a man!
> The memo also presumes that Google's full-time diversity experts haven't
> even thought of his concerns. He asserts that seeking out women necessarily
> lowers the hiring bar for them, instead of asking "How are we mitigating the
> risk that our pro-diversity push might itself introduce bias into our
> ideally gender-agnostic perf evaluations?" That's a valid question, and I'm
> sure Google's diversity team has answers, and I'm sure that some people
> wouldn't be satisfied with those answers. But jumping to the conclusion that
> Google's women must be less qualified than the men, just because he can't
> think of a way to mitigate bias in the hiring pipeline, is self-centered and
> disrespectful.
This is just you projecting your presumed intentions onto the author. At no
point in the memo did he claim or imply that Google's women are less qualified
than the men. The only paragraph that can really be taken to say that is the
part about "lowering the bar" for diversity candidates; Which is, admittedly,
an unfortunate choice of words in retrospect. However the same sentence
clarifies that the bar is "lowered" by decreasing the false-negative rate for
diversity candidates, meaning those that are accepted are still qualified at
the same standards. The sentence also includes a reference for this claim, but
this is unfortunately to an internal Google group so we don't know its
contents.
On the other hand, right at the start of the document the author takes pains
(including a big colorful picture to illustrate the point) to point out that
"you can’t say anything about an individual given these population level
distributions," which should make it pretty clear that he's NOT claiming
Google's female engineers are less qualified.
> I'm very much in favor of a world where it's equally okay to express all
> ideas! But that doesn't mean we should be equally okay with all modes of
> expression. No matter which side we're on, we need to think first, then
> speak. Given the meta-thesis of the memo (especially the "prioritize intent"
> section), I'm not convinced that the author took much time to consider needs
> beyond his own.
This is saying that one must choose his words like a politician and consider
the reaction of the world at large when distributing a personal opinion
document not intended for wide publication to a select group of individuals.
The idea that one's career might hinge on using the proper newspeak in such a
document is, frankly, terrifying to me. Do people have a right to be upset
about his choice of wording or angry at his opinions? Sure, absolutely! But
losing your career for this, over an opinion that is, arguably, not really
harmful or hateful and expressed in a relatively considerate tone, is
something else entirely.
~~~
matchu
Mm, thanks for calling out the false-negative thing! I think I misparsed that
the first time around and got confused between decreasing false negatives and
increasing false positives. That's embarrassing, sorry ^_^`
In any case, I think I made a mistake suggesting specific improvements to the
memo; lemme pop off the stack a bit:
It's not okay to publish a document to your coworkers that will predictably
make them feel unsafe. Full stop.
When you want to express an idea at work, you need to engage in empathy, and
try to express yourself in such a way that your coworkers will still feel safe
with you. If you can't figure out how to express an idea without hurting your
coworkers, then, yeah, you don't get to express it unless you figure something
out :/ That's an appropriate workplace policy, and I'm comfortable with the
general idea that freedom of expression is subject to some conditions. I know
not everybody agrees with that prioritization, though!
More importantly, I'm just tired of articles like this one dismissing the
social consequences lens outright. There's more than one valid issue being
raised in our community right now, and the importance of one doesn't
invalidate the others. Let's have both conversations: how to enable expression
of less common ideas, and how to ensure that we express them empathetically.
If we approach the problem thoughtfully, I think we can optimize for both :)
(BTW I edited this comment a lot during the first 30 minutes, and pretty
significantly changed its contents. Sorry if that ends up being an issue!)
~~~
slavak
I actually tend to mostly agree with you on this. I think the safest and most
rational policy is just to avoid discussing sensitive topics at work so as not
to risk creating a hostile atmosphere, and I don't consider this an
unreasonable restriction on freedom of expression. Talk politics and
immigration with your friends and family, not your teammates at the office.
My problem is that Google as a company, at least as far as the Mountain View
campus goes, apparently disagrees. My understanding -- and it's possible I'm
wrong -- is that Google supports and encourages openly discussing a variety of
topics at work, and the internal tool he used to publish his memo was designed
and used exactly for this purpose. (What Googlers apparently describe as "an
internal-only Reddit.") If this is true then he was fired not for discussing
inappropriate topics, but for holding opinions the hive-mind finds
disagreeable.
Either you as a company support discussing sensitive topics in the office, or
you don't. If you don't that should be made clear and enforced equally for
everyone. If you do then you can't pick and choose which opinions you approve
of based on what's popular, and expressing a dissenting opinion should not, at
the very least, be a fireable offense!
------
thowaway26539
It would have been much fairer to call it an "anti-affirmative-action
memo". Framing it as an "anti-diversity screed" is a pretty biased move, not
to mention how they removed his supporting content as well. Certainly there
are many who hold the opinion that "anti-affirmative-action === anti-diversity",
which is a point worth debating separately, but I still find the
framing used by most of the news articles very misleading.
~~~
4bpp
This (conflating opposition to a process or movement for X with opposition to
X) seems to be a common trick in the political discourse nowadays, and is
unfortunately called out very rarely. We didn't hear many instances of "anti-
jobs" as suggested in this article, but it seems like describing a variety of
institutions as "anti-white" has been a right-wing staple since long before
the emergence of well-connected Tumblr and Twitter rubes who could plausibly
be described as such.
------
sandstrom
The author of the memo is basically saying the same thing that got Larry
Summers axed as president of Harvard.
Harvard University President Lawrence Summers [was fired] for
mentioning at a January 14 academic conference the entirely reasonable
theory that innate male-female differences might possibly help explain
why so many mathematics, engineering, and hard-science faculties
remain so heavily male.
Isn't the idea with free speech that you allow people to say things that you
disagree with?
Or as someone else has already phrased it nicely:
"After all, if freedom of speech means anything, it means a willingness
to stand and let people say things with which we disagree,
and which do weary us considerably."
[https://www.theatlantic.com/magazine/archive/2005/02/why-
fem...](https://www.theatlantic.com/magazine/archive/2005/02/why-feminist-
careerists-neutered-larry-summers/303795/)
------
clairity
i appreciate the article for pointing out a nuance often lost in this kind of
situation: that teasing out positions and perspectives requires a careful
reading, and summaries are often (intentionally) misleading.
but let's be clear: the memo was a political document (in the common sense of
the word, rather than about government machinations). sure, james damore may
have been trying to have an honest conversation (and honest discussion should
totally be encouraged), but the guy's biases and position were clear right
from the title onward. he was attempting to assert what he thought was a
superior position and got shot down. now others who (secretly or otherwise)
share some portion of that position feel vulnerable and defensive, and we get
heated discussions driven by primal emotions using otherwise rational-sounding
words. it's politics.
that's what the media is zooming in on, because that's where the charged
emotions are. cynically, yes, that sells papers (or whatever), but less
cynically, that's also where we collectively seem to want more discussion
because the social norm is (potentially) shifting and not at all well-defined
or collectively understood. the media didn't make a mistake so much as it
instinctively cut right to the chase.
~~~
mpweiher
> biases and position were clear right from the title onward
What were those "biases" and "positions", pray tell?
> vulnerable and defensive
You mean those that lied about what the guy wrote and then attacked that made-
up wrongthink?
~~~
bandrami
It's actually about ethics in game journalism?
------
alistproducer2
I'm going to join in on the chorus of people here who are voicing their
displeasure with the way the modern media works. I'm a left-winger, but I feel
much the same way about the media and the chattering class as the most noxious
parts of the right wing do. It's not so much that the media is biased in one
direction or the other; it's that the MSM has mostly abrogated its
responsibility to inform the public. The media, and the class of people that
create its content, see themselves as influencers more than reporters.
As an example, take the performance (and I do mean that literally) of Jim
Acosta when he made a speech disguised as a question to Stephen Miller about
the poem on the Statue of Liberty. Who told Mr. Acosta that what the public
wants from its journalists are speeches instead of substantive questions?
It's often said that politicians want to be movies stars. these days it seems
that the journalists want to be politicians and it's become a problem that
Americans all over the political spectrum are beginning to see.
~~~
wu-ikkyu
>MSM has mostly abrogated it's responsibility to inform the public.
That's a myth.
The only real responsibility they have is to maximize profits for their
shareholders.
~~~
alistproducer2
I dispute that. Most major newspapers in the States existed long before it was
common for media outlets to be publicly traded companies. The term "fourth
estate" dates back almost to the time of the French Revolution, so while it's
always been expected that newspapers need to be profitable to exist, the
interests of a paper's readers and of those who stand to profit from it have
not always been as far apart as they are now.
~~~
wu-ikkyu
It would seem that not much has really changed since then:
"To your request of my opinion of the manner in which a newspaper should be
conducted, so as to be most useful, I should answer, `by restraining it to
true facts & sound principles only.' Yet I fear such a paper would find few
subscribers. It is a melancholy truth, that a suppression of the press could
not more compleatly deprive the nation of it's benefits, than is done by it's
abandoned prostitution to falsehood. _Nothing can now be believed which is
seen in a newspaper._ Truth itself becomes suspicious by being put into that
polluted vehicle. The real extent of this state of misinformation is known
only to those who are in situations to confront facts within their knowlege
with the lies of the day. I really look with commiseration over the great body
of my fellow citizens, who, reading newspapers, live & die in the belief, that
they have known something of what has been passing in the world in their time;
whereas the accounts they have read in newspapers are just as true a history
of any other period of the world as of the present, except that the real names
of the day are affixed to their fables. General facts may indeed be collected
from them, such as that Europe is now at war, that Bonaparte has been a
successful warrior, that he has subjected a great portion of Europe to his
will, &c., &c.; but no details can be relied on. I will add, that the man who
never looks into a newspaper is better informed than he who reads them;
inasmuch as he who knows nothing is nearer to truth than he whose mind is
filled with falsehoods & errors. He who reads nothing will still learn the
great facts, and the details are all false."
-Thomas Jefferson, letter to John Norvell, _June 14th, 1807_
------
dgudkov
All major Canadian media called the memo anti-diversity. So unprofessional and
biased.
[1] [https://www.google.ca/search?q=anti-
diversity+canada+google](https://www.google.ca/search?q=anti-
diversity+canada+google)
------
jasode
_> to help him avoid alienating his audience,_
The gender-diversity topic is too charged to accomplish that. Seriously, I
would challenge essayists from either side of the debate to write any
significant words on the subject that does not "alienate the audience".
------
mhalle
Completely agree that the coverage of this memo would have benefited greatly
from more careful reporting by the journalism community.
While that may be the case, however, it isn't like the uproar and
misinterpretation couldn't have been predicted. Whatever the academic merit of
the memo author's claims, the memo was thrown into a social context that was
clearly primed for snap judgement.
Particularly regarding social issues, how we write is just as important as
what we write. An effective argument connects and convinces, anticipating
possible reactions and (mis-)interpretations of the reader.
The irony is that the memo is missing the necessary empathy and social
awareness for the audience, qualities that the author attributes to women.
------
eli
It's ironic that many of the comments here assume Gizmodo and others were
acting out of malice to intentionally mislead their audience.
~~~
dvfjsdhgfv
I don't know, since I can't read their minds, but both options are troubling.
If they did it for easy click-bait money, it's ethically bad. But if they did
it to push an agenda, that's even worse, because it means they manipulate other
narratives in the same way. If they can convince very smart HN readers, how
totally helpless is the rest of the population?
~~~
eli
Why is assuming journalists are unethical different from assuming Google
engineers are sexist?
~~~
dvfjsdhgfv
That's why I prefer not to assume anything.
------
erikpukinskis
I generally liked the memo, and I'm pretty aggressively pro-affirmative
action. Some of the memo is factually wrong, but I think he tried to be
measured, and I'm proud of him.
The sentence that popped out at me is this one:
"I don’t think we should do arbitrary social engineering of tech just to make
it appealing to equal portions of both men and women. For each of these
changes, we need principled reasons for why it helps Google."
I think it's hard for women to understand this, because they are much more
likely to have an intrinsic understanding of why pro-diversity social
engineering makes sense. As a man, it has often not been obvious to me. I
learned a long time ago to assume that women are right about gender stuff, and
that assumption has served me extremely well. I've learned an incredible amount,
and those acts of goodwill made women much more inclined to be gentle with me
and explain things.
But I don't think institutionalizing that kind of trust is tactically
feasible, and I'm not sure it would even be a good thing if it happened.
Because I think pro-diversity policies will be _strengthened_, not weakened,
if staff can form a rigorous story about how they help the company.
I believe affirmative action helps Google, so I don't think it will be
impossible to tell that story, but it won't be easy. It will take work. Mostly
because liberal circles don't really talk about it. Diversity is seen as a
benefit to diverse people, and therefore good, 'nuf said, as the Google VP
quipped.
I think affirmative action is valuable for the reason the anonymous memo
writer thinks it's problematic: because different people are different. I
don't think if 50% of Google coders were women that Google would stay the
same. I think it would become a very different company, because women have some
differences from men in aggregate.
And so changing admission requirements to help more women get the jobs
shouldn't necessarily be seen as lowering the admissions standards; it should
be seen as changing the set of things that coders are allowed to focus on. And
we should assume that we'll see a whole new influx of a different kind of men
too, men who are more similar to the women in the middle of their bell curve,
than the men at the middle of theirs.
But ideally the way that should happen is not by saying "let's take 50%
women", but by saying "if we were to accept 50% more women, what new kinds of
Googler would we be adding? How will those folks make Google stronger? How can
we change our hiring criteria to find the best of that kind of Googler?" and
yes, those criteria would bring in a lot more women, but they'd also bring in
a smaller number of new men! And everybody involved would have an
understanding of how Google was getting better. Those women would have more
respect. Those men would be better appreciated, even as they were operating
outside of old Google norms.
I will say, just ramming 50% more women into the culture is probably fine
though. While I mostly agree with the memo's general thrust, that it would be
better to do this a different way, I think the alarmism is a little out of
place. It's certainly a problem that conservatives are afraid to speak up
about gender issues, but I doubt that's Google's biggest culture problem right
now.
~~~
agarden
I think the idea you are advocating, that they should change the nature of the
job to make it more accommodating to a more diverse talent pool, is the kind
of thing the memo was advocating. See the section titled "Non-discriminatory
ways to reduce the gender gap".[1] In that section, he suggests making the
work more cooperative instead of competitive and making part-time work first-
class, among other things. He also suggests making the work more
collaborative, which seems a lot like your "changing the things coders are
allowed to focus on." He then goes on to note that doing this will also mean
Google will diversify the kind of male that it gets, just like you do.
Or am I missing some difference in what you are suggesting and what the memo
suggested?
1\. [http://diversitymemo.com/#reduce-gender-
gap](http://diversitymemo.com/#reduce-gender-gap)
~~~
erikpukinskis
Yes, I think our methodology is the same, but the memo seems to expect that
Google will still be <50% women in that scenario, whereas I believe it will be
brought to 50%. Which is why I don't see the quota system as a fundamental
risk to Google, whereas he does.
------
bandrami
I'm not sure why anybody thinks his "intention" or "motivation" are important.
Can somebody who thinks that say more?
~~~
zbyte64
Because one of his recommendations was to put a greater emphasis on intent?
~~~
bandrami
OK, but that's begging the question (to use this phrase correctly, for once).
I _intend_ to write the next generation search engine. Should Google reward me
for that?
------
arca_vorago
What this is all really about, when it gets boiled down, is the testing of
freedom of speech by using offense to rally the mob for censorship of a
minority. If the oligarchy gets this style of democratic self-censorship past
us, it's one more nail in the coffin of free peoples everywhere.
------
rabboRubble
Apparently, I do not have strong standing from which to comment on this. I'm
prone to neuroticism. My bad. I would have never independently discovered
this about myself without a Google guy to point it out.
Thanks Google dude!
------
Aron
This article is so well written I am left wondering what happens if the tide
completely reverses and it becomes overwhelmingly clear that Google should not
have fired him.
------
quxbar
> To me, the Google memo is an outlier—I cannot remember the last time so many
> outlets and observers mischaracterized so many aspects of a text everyone
> possessed.
How about every article or video about cryptocurrency, PRISM, AI, 'Data
Science' and a litany of other topics in tech? I have seen almost no 'tech'
journalism of any merit, so I'm not surprised to see sloppy coverage of
another complex issue. But that shouldn't stop meaningful HN comment threads
:)
The fact is, the memo does not simply put forth a question for debate, it
treats a massive legacy of misogyny in our culture as a feature, not a bug. He
really genuinely sees no problem with a world that pushes people into gender
roles. In fact, he thinks we should optimize for it. It's a selfish tantrum
thrown by someone feeling a lack of affirmation - disguised as a vague argument
that he really understands people, tech, and companies much better than his
bosses.
If it helps, remove anything about identity in here: it's as if someone
posted a cruel, snide rant in defense of GOTO statements, attacking OO
programmers. Not only is he wrong, he made an extended case for the wrong
argument and did it in a way that inflicts maximum company damage.
So yeah it would be a big ol' red flag.
~~~
josteink
> Not only is he wrong
That's merely your opinion which, as can clearly be seen from the comments
here, has disagreement.
> he made an extended case for the wrong argument and did it in a way that
> inflicts maximum company damage.
How did he inflict damage on the company?
Whoever _leaked_ this _internal memo_ did the damage, but I haven't seen any
witch hunt or firing in that regard.
Strange, eh?
------
joelrunyon
Why is this on the second page with 246 points in 1-2 hours? Seems strange...
~~~
md224
Not sure if you'll see my reply, but this happens often to HN submissions on
politically charged topics. The topic gets flagged to death. It's been like
this for a long time.
~~~
joelrunyon
Why wouldn't I be able to see your reply?
------
Dowwie
What are the chances anyone at Google read the Memo as carefully as this
author did before persecuting James Damore?
~~~
Diederich
I suspect the contents of the memo had little to do with Google's official
reaction to it.
------
turc1656
Stop the presses. You mean to tell me the major media organizations used a
misleading headline and description for something highly politicized?! I'm
shocked. Shocked, I tell ya.
------
peterwwillis
Language and gender nit-pick: having men and women at 100% parity is not
gender diversity, it is gender-binary. You would have to hire a lot more non-
binary-gendered people for it to be diverse.
~~~
sn9
I honestly wouldn't be surprised if non-binary-gendered people are _over-
represented_ in tech.
(This is just an impression I have based on things I've read in the past, so I
can't be sure.)
------
Marazan
True fact: saying "I'm not a racist but..." makes anything you say afterwards
definitely not racist.
------
vkou
Prefacing an argument with "But before you guys mistakenly think that I'm
racist - remember - I have black friends" is not the most important point in
an argument.
It is, arguably, the least important one. So much so, that it is a non-
sequitur.
Free Flat UI Kit - AndrewCoyle
http://designedthought.com/thank-you.html
Practice Flat design - download the free user interface kit.
======
wingerlang
I kind of liked the "pay with a tweet" in return for getting the resources for
free. Then again, with that website I have no idea how they look, if I want
them and especially not if I want to redistribute them (via the tweet).
I'd prefer to "pay" after I know exactly what I am getting.
------
Gigablah
How is this "free"? There's no site navigation, no download link, no decent
preview, and only one button saying "pay with a tweet". I'm not clicking that,
it reeks of spam.
~~~
AndrewCoyle
I apologize for the confusion. I created this UI kit to build awareness about
my new blog. You can see a better preview on my Behance.
[http://www.behance.net/gallery/Flatter-A-Free-UI-
Kit/1033195...](http://www.behance.net/gallery/Flatter-A-Free-UI-Kit/10331953)
Hope you enjoy it.
~~~
siddboots
The "Download Flatter UI For Free" link on your blog goes straight back to the
page asking me to "Pay with a Tweet".
Is it free, or are we paying for it?
Do I need to get a twitter account?
~~~
AndrewCoyle
The UI kit is free if you tweet about Designed Thought. You can also "pay"
with Facebook or LinkedIn.
------
wilg
"Pay With A Tweet" sounds like the worst thing in the world.
~~~
AndrewCoyle
Well if it is worth the trouble I would love to share more design resources
with you. I checked out your film work. Good stuff. I am envious of
filmmakers' abilities.
Illusions of Sound Perception - DavidSJ
http://sethares.engr.wisc.edu/htmlRT/4%20sound_illusions.html
======
Stenzel
There is no illusion of pitch, and it is a common misconception that the
fundamental frequency must be present in a tone. Pitch is the perceived
periodicity of a tone, which is roughly the greatest common divisor of the
harmonics. If perceived pitch without the fundamental is considered an
auditory illusion, then common pitch detection techniques should fail when
the fundamental is not present, but they work quite well in its absence. So
either there is no illusion of pitch or algorithms have illusions too.
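To make that concrete, here is a toy sketch in plain JavaScript (all numbers
are illustrative choices, not from the article): synthesize a tone out of the
harmonics of a 200 Hz fundamental while leaving the 200 Hz component out
entirely, and a naive autocorrelation detector still reports roughly 200 Hz,
i.e. the perceived pitch rather than any frequency actually present.

    // Harmonics of a 200 Hz tone, with the 200 Hz component itself absent.
    const sampleRate = 44100;
    const n = 4096;
    const harmonics = [400, 600, 800, 1000];

    const x = new Float64Array(n);
    for (let i = 0; i < n; i++) {
      for (const f of harmonics) {
        x[i] += Math.sin(2 * Math.PI * f * i / sampleRate);
      }
    }

    // Naive autocorrelation: the lag with the strongest self-similarity
    // is the perceived period (roughly the GCD of the harmonics).
    function detectPitch(signal, rate, minHz, maxHz) {
      let bestLag = 1, bestCorr = -Infinity;
      for (let lag = Math.floor(rate / maxHz); lag <= Math.ceil(rate / minHz); lag++) {
        let corr = 0;
        for (let i = 0; i + lag < signal.length; i++) {
          corr += signal[i] * signal[i + lag];
        }
        if (corr > bestCorr) { bestCorr = corr; bestLag = lag; }
      }
      return rate / bestLag;
    }

    console.log(detectPitch(x, sampleRate, 50, 1000)); // ~200, the "missing" pitch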
~~~
DoctorOetker
the answer is: algorithms have illusions too.
consider 2 ideal harmonic notes with a frequency ratio of 3:2, say 3kHz and
2kHz ... The brain / algorithm must choose between interpreting the collection
of frequency peaks at m * 3kHz, and n * 2kHz as either (occasionally
overlapping) harmonics of 2 notes at 2kHz and 3kHz, OR it could interpret this
as harmonics of a single note at 1kHz (as you say the GCD of the frequencies).
There is inherent ambiguity between interpreting as 2 notes of each a timbre,
vs interpreting as 1 note with another timbre...
One could physically construct 3 bowed strings, with mode-killing on the 1kHz
string, such that these could make perceptually identical sounds whether the
2kHz and 3kHz strings are played simultaneously or the 1kHz string is played
alone.
at that point, from the sound alone, one cannot discern in an ABX test which
is the case: neither a human brain nor any algorithm can. The ambiguity
forces a guess (deterministic or not).
The sound is a projection of properties occurring in reality, and loses
information.
~~~
Stenzel
True, but ambiguity does not imply that one possible interpretation must
necessarily be an illusion.
~~~
AstralStorm
Tones and harmonics get clustered into pitches, e.g. mistuned harmonics as
seen in bass guitar or piano still get decoded into pitches via some sort of
best match if the mistuning does not exceed a certain percentage. And it works
even if some harmonics disappear and reappear.
The pitch is higher level than purely perceptual.
~~~
DoctorOetker
this is correct, and the reason we are tolerant is because of dispersion: even
though the different harmonics are present on the same string of the same
length, the resonant frequencies don't need to be integer multiples of the
fundamental since waves of different frequency have different propagation
speeds on the string.
in the case of bowed strings mode-locking ensures the phases of all the
harmonics are reset each cycle (the bow sticks and slips), so that bowed
instruments can be played harmonically to parts per billion.
since a lot of sounds are plucked, we must be tolerant of frequency-dependent
propagation speeds in regular strings / media
------
dr_dshiv
The Shepard tone illusion, of an ever rising pitch, is used in the movie
Dunkirk
[https://www.businessinsider.com/dunkirk-music-christopher-
no...](https://www.businessinsider.com/dunkirk-music-christopher-nolan-hans-
zimmer-2017-7)
~~~
rrss
And the dark knight films. Apparently Nolan is fond of it.
~~~
dawnerd
And rightly so. It's incredibly effective at building suspense. Hans Zimmer's
incredible scoring also helps.
------
sp332
Here is a video I enjoyed that explains the basics of how our sense of sound
works. It makes it easier to understand why some of the illusions happen.
[https://vimeo.com/147902575](https://vimeo.com/147902575)
~~~
holy_city
Typo in the video (I think, not an anatomist), it's "basilar" membrane, not
"vasilar." Awesome video though, I wish my speech processing professor had
used that instead of teaching hearing like a filterbank, even if that is how
we needed to understand it.
Another weird thing about hearing: the hairs that vibrate aren't just tuned to
particular frequencies, they actually vibrate over a range, and the response
isn't symmetric (although iirc, part of that is from the fact the hairs are
mechanically coupled). That's why low frequency noise masks high frequency
noise more than vice versa, which is exploited in lossy codecs (if there's low
frequency energy, you don't need high frequency energy that it masks).
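To make the codec point concrete, here is a toy sketch of upward masking; the
band energies and the per-band spread below are made-up illustrative numbers,
not a real psychoacoustic model:

    // Each band's energy raises a masking floor that decays toward higher
    // bands; any band under the current floor is inaudible and needs no bits.
    const bandsDb = [60, 20, 18, 15, 40, 10]; // energies, low -> high frequency
    const spreadDb = 12;                      // assumed floor drop per band

    function audibleBands(bands, spread) {
      let floor = -Infinity;
      return bands.map(level => {
        const audible = level > floor;
        floor = Math.max(floor - spread, level - spread);
        return audible;
      });
    }

    console.log(audibleBands(bandsDb, spreadDb));
    // [true, false, false, false, true, false]: the quiet bands sitting just
    // above loud ones are masked, so a lossy codec can discard them.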
~~~
sp332
That correction is in the video description, so yeah.
------
tony
For more:
[https://en.wikipedia.org/wiki/Auditory_scene_analysis](https://en.wikipedia.org/wiki/Auditory_scene_analysis)
This book _Auditory Scene Analysis: The Perceptual Organization of Sound_ by
Albert S. Bregman has more.
More foundational info on how we "fill in" information we see / hear
[https://en.wikipedia.org/wiki/Gestalt_psychology#Properties](https://en.wikipedia.org/wiki/Gestalt_psychology#Properties)
------
carapace
See also "sine-wave speech".
[http://www.scholarpedia.org/article/Sine-
wave_speech](http://www.scholarpedia.org/article/Sine-wave_speech)
Oooo! Don't miss the _acoustic chimera_ on that page...
------
anotheryou
That was just one :/
I give you another related one, for rhythm:
[https://youtu.be/oQf_tS5WAP4](https://youtu.be/oQf_tS5WAP4)
Things I wish I knew the day I started Berklee - gnosis
http://sivers.org/berklee
======
dbrannan
I really like the martial arts saying he uses:
"When you are not practicing, someone else is. When you meet him, he will
win".
I remember years ago when I was on the swim team I had missed two practices.
My coach said I had missed 4 practices, and I tried to correct him but he
said, "You missed 2 practices, but your competition did not. So now you are 4
practices behind your competition."
I always remembered that.
~~~
Psyonic
I want to believe that's insightful but I can't seem to interpret it any way
other than a math fail. I could potentially have 10 practices, miss 2.
Opponent has 10, misses 0. Opponent 10 - Me 8 = 2 missed practices... If the
competition did miss the practices, it'd be a wash, so we'd all have
essentially missed 0.
What am I missing here?
~~~
chrisa
I think this is the rationale: In sports (or anything that requires practice),
going some time without practicing actually makes you lose some of the ability
you once had. So in this case, it would take two practices just to get back
what was lost during the break, at which point he would be 4 practices behind.
~~~
ceejayoz
I took it as just "not only have you not improved, but your opponent improved
while you weren't improving too".
~~~
Psyonic
Right, but that still only accounts for +2. If he hadn't improved, it'd be a
wash. Only regression on your part gives a +4.
------
hasenj
I like the "be valuable" advice. It's what all college students should keep in
mind.
The point of education is not to get a "certificate" that proves to your
future employers that you went through the motions of education.
The point is to make yourself valuable.
I'm surprised how many people are oblivious to this.
So many people view education as nothing more than "something boring you have
to do so that you get a decent job". Where a decent job is "something boring
that you to do to make a decent living".
There's a contradiction there somewhere: if everything you do is boring, how
"decent" is your living? really.
What's their idea of a decent living? "Getting paid enough to pay the bills
and send the kids to school and make them not have to worry about doing any
work". In other words, a decent living is the ability to make your children's
life just as boring as yours is.
None of this brings any happiness.
~~~
mechanical_fish
_I'm surprised how many people are oblivious to this._
There is little in one's formal educational experience to prepare one for the
concept that one can do work that has real value. So much of what you work on
is an exercise, a problem that has already been solved that you must solve
again for a grade or a prize, after which your work will be thrown away.
~~~
gridspy
It is unfortunate how much programming is replicated by hundreds of
businesses, not shared and then thrown away 3-10 years later.
Thank god for open source and the corresponding increase in code sharing.
~~~
mechanical_fish
That's the idea, all right. Fix a bug or add a module in an open source
project, experience the power of actually making a difference.
------
sp4rki
_how much does the world pay people to play video games?_
Actually if you're good enough, plenty of money. It's not a matter of how many
people do it, it's a matter of how much better you are than the many people
that do it are. Amateur programmers shouldn't be making software for Bank of
America, the same way an amateur musician shouldn't be playing for Dream
Theater. The interesting thing is that one generally doesn't notice when you
cross the line that makes you a professional, which is generally delimited by
profitability.
If you can make money with your abilities it's because there are a bunch of
people that can't, but never make the mistake of thinking that because a lot
of people do something it means you can't make money off it. Oh and of course,
the person with such abilities that doesn't take advantage of them to make
money doesn't deserve them (with the exception of the multitalented who
leverages a 'better' skill or the person leveraging those skills in a risk
filled endeavor for larger profits).
~~~
ido
Oh and of course, the person with such abilities
that doesn't take advantage of them to make money
doesn't deserve them (with the exception of the
multitalented who leverages a 'better' skill or
the person leveraging those skills in a risk
filled endeavor for larger profits).
I think I understand what you meant, but I can't completely agree with what
you said - sometimes there are indeed worthy uses of skill for a purpose other
than making money.
~~~
sp4rki
Re-reading the comment I see that I made a mess out of my thoughts. If you're
a businessman who knows music, hell, more power to you, right? I meant the
cases where you have people doing amazing things programming, or composing
music, or whatever, but not making any money and having to work in a call
center or a grocery store packing bags. I've seen many people I know make such
retarded choices it's not even funny. Reminds me of Good Will Hunting. It
would have been a complete waste to give the guy such a mind so he can go work
at a construction site right?
~~~
maxawaytoolong
What about Grigori Pereleman?
Is he a complete waste because he declined to take the prize money for his
proofs?
~~~
stoney
Grigori Pereleman would be doing mathematics with or without the prize money.
I think sp4rki's point is that it's a waste if not taking/making the money
means that you have less time to do whatever it is you're good at. E.g. the
artist who could make money from it but chooses not to, and as a result has to
work at the convenience store, and as a result produces less art. Though that
last bit is a bit contentious - maybe working at the convenience store is a
good source of inspiration for that artist.
~~~
maxawaytoolong
_Grigori Pereleman would be doing mathematics with or without the prize
money._
But, according to various accounts he's probably not. He quit doing
mathematics and just lives with his mother and goes to the movies.
[http://en.wikipedia.org/wiki/Grigori_Perelman#Withdrawal_fro...](http://en.wikipedia.org/wiki/Grigori_Perelman#Withdrawal_from_mathematics.3F)
------
RK
I attended music school for a short time after college (not Berklee). After
having done a very tough BS in physics, I found the slow pace of the music
theory classes pretty frustrating. I worked ahead, but not at the pace I
probably could have. I very much agree with his point about not letting others
(i.e. courses) set the pace.
Also, having another degree I think I had a different perspective than many of
the high school graduates that were there with the idea of becoming a rock
star or whatever. Most of the instructors, etc, made their livings by
teaching, playing random gigs, doing essentially anonymous studio work, and
odd jobs. Music is a very hard business. This reality seemed mostly lost on
the majority of the students. I decided that I was probably happier to have a
"real" job and play music on the side.
~~~
antareus
I'm considering the same jump (real job -> music school for a bit). Where did
you go? What'd you make of it? Did you switch out? The first sentence suggests
that you did. I'm perfectly happy to play local venues for the rest of my
life.
------
Zev
_Stay offline. Shut off your computer. Stay in the shed._
I bookmarked this and stopped reading after that. Nice reminder to get back to
coding for me.
~~~
baddox
But not before zipping back to HN to submit that comment. ;)
------
unwind
I had huge trouble understanding this, before Googling "Berklee" and realizing
it's short for "Berklee College of Music".
That led me to think that the author didn't in fact start Berklee, he started
_at_ it. That is not communicated in the title, and also the post itself
contains sentences like this:
_Luckily, when I was 17, a few months before starting Berklee, I met a man
named Kimo Williams who used to teach at Berklee and convinced me that the
standard pace is for chumps._
So, can we please have an "at" in the HN title, at least? :)
------
osuburger
While I can respect what the author is saying, I don't think everyone should
follow this advice. While I've definitely had my fair share of time spent in
the "shed", working on projects for both school and my own side ideas, I don't
think a true college experience can be had by being like this all the time.
There is nothing wrong with occasionally being distracted by your peers; I've
had lots of great nights going out for a couple drinks on a Wednesday night
just because I can. In the end, it all has to be about balance in my opinion.
Definitely go (far) above the bare minimum, but I know I could never stay sane
without the occasional break or fun night out.
------
alexophile
On point #2 he references his training under Kimo Williams, which he wrote
about at length last year:
<http://sivers.org/kimo>
------
zzzeek
He is right that you need to shed a whole lot more than you might be motivated
for.
But also, success in the field of music requires a level of social
assertiveness and competence that is way beyond what it is in technology.
Nobody cares about your cool grooves or whatever, you have to fight to stay on
board. But oh you can build my ecommerce site for me?
I got out of Berklee in '92 and basically dicked around trying to get non-
shitty gigs for a few years, not going to enough jams and auditions, until an
ocean of interest and money came at me to do anything related to computers,
after I had sworn them off to be a musician. I was interested in eating and
not living in a box. The market decided for me on that one - scratch and claw
your way to get some real music gigs, or step into this plush world of "wow
you can program?". Wish I could play again.
------
sgoranson
Disagree strongly with #6. It's too easy to find counterexamples of brilliant
artists who've created immeasurable value and died penniless. Market value !=
intrinsic value.
~~~
ScottWhigham
This is fairly fascinating to me - I'm a musician as well and love this stuff.
What would have happened if John Coltrane, for example, had done what Miles
Davis did and focused on the business side more?
I wonder, though, if perhaps you are nit-picking on a pithy title? "Be
valuable" doesn't have to mean "learn business at the expense of creating
value in your music", does it?
~~~
sgoranson
I think he was pretty clear with "Making sure you're making money is just a
way of making sure you're doing something of value to others."
Maybe it's like this: getting paid implies your work has value, but your work
may have value even if you are not getting paid?
~~~
derefr
Your work _may_ have value, but you won't know for sure; you might just be
heading down the wrong path and producing crap. Getting paid gives you an axis
to measure your efforts upon, which gives you a direction to hill-climb.
------
baddox
> _Berklee is like a library. Everything you need to know is here for the
> taking. It's the best possible environment for you to master your music. But
> nobody will teach you anything. You have to teach yourself._
Sounds _exactly_ like a library, except (I presume) extremely expensive. I
only went to college because I assumed (correctly) that at least a few great
minds would be there. What's the upside to Berklee?
~~~
coliveira
the same: "few great minds would be there"
------
sayemm
I freaking love this, thanks for posting it
A ton of great lines in there, as Derek Sivers is an amazing writer jam-packed
w/ wisdom much like PG, but this is my most fav one out of the pack:
"But the casual ones end up having casual talent and merely casual lives."
------
tomjen3
>When you emerge in a few years, you can ask someone what you missed, and
you'll find it can be summed up in a few minutes.
>The rest was noise you'll be proud you avoided.
Yes -- almost, but you will probably feel that there are one or two things
that you didn't experience and will miss not being a part of.
~~~
dreaming
Exactly. Important not to overlook the benefits of meeting like minded people
who can help inspire you, or just keep you sane.
------
deutronium
Really loved that post.
Especially the quote "The casual ones end up having casual talent and merely
casual lives."
~~~
Gianteye
I'm not sure about that. There are plenty of boring jobs to be had, and quite
a few of them are to be had at Google. I suppose banality and life
satisfaction aren't mutually exclusive, but it's the case for me. Doing
computational database analysis whether at Facebook or at the local grocery
store for me is a bitter kind of hell.
I think I agree with the point, but would refine it. Both talented and
talentless people have the option of living boring lives. The more talent you
have, and the more willing you are to focus and direct it, the more leverage
you have to launch yourself into a fascinating and entertaining lifestyle.
------
thefool
The "don't get stuck in the past" bit is a fine line you have to walk.
It's dumb to spend your whole creative life simply reproducing ideas that
seemed obvious decades ago. You can get a lot better if you know what other
people did, and then consciously build on it.
------
rb2k_
> In just 3 intensive lessons, he taught me 3 semesters of Berklee harmony, so
> on opening day I started in Harmony 4.
> In one intensive lesson, he taught me the whole semester of Arranging 1
I don't know if that actually says something about Berklee or about Music...
------
tibbon
Seeing my alama matter on the top of HN was unexpected. Sounds like someone
learned a great deal about life at Berklee - unfortunately many don't.
~~~
narag
Excuse the nit picking, I hope you'll like to know: _alma mater_ is Latin for
"feeding mother".
------
LiveTheDream
"Do not expect the teachers to teach you."
I am all about teaching yourself and internal thirst for knowledge, but this
is a bit depressing.
------
zackattack
his tales about kimo make me wonder why they don't hire kimo to come in and
restructure their courses.
------
jeberle
Certainly one of these things should have been that the school was founded by
Lee Berk, who thought the name "Berklee" was good in light of UC Berkeley!
~~~
jeberle
[http://en.wikipedia.org/wiki/Berklee_College_of_Music#Histor...](http://en.wikipedia.org/wiki/Berklee_College_of_Music#History)
------
nolite
this is my new hero
~~~
lanstein
this is my old hero
Ruby on Sails: a homebrew Google Wave provider - mcantelon
http://danopia.net/posts/12
======
elblanco
Neat! Hopefully a good example of how Wave might have a quick uptake.
Being relatively easy to write for and against might mean a fast adoption
rate.
------
catch23
Sounds like Ruby on Snails for some reason...
Engineers, come get your $250K salary - amduser29
http://www.cnet.com/news/silicon-valley-talent-wars-engineers-come-get-your-250k-salary/
======
7Figures2Commas
The fine print:
> Not every Weeby engineer will earn the million dollars. Weeby will subject
> everyone to monthly performance reviews, and managers will make quick
> decisions, either granting the next $10,000 raise or offering feedback about
> needed improvement. Some will be terminated but given a "healthy severance"
> of at least $20,000, plus references.
This structure sounds like a recipe for stress, low morale and high turnover.
And why in the world would the company promise references to employees it had
to fire for not meeting expectations? "John is a great programmer. He made it
to $150,000/year in annual compensation before we determined that he wasn't
worth $160,000/year and had to terminate him. He's a steal if you pay him less
than $140,000/year!"
------
Iftheshoefits
I'm mixed about this. On one hand, $250k/yr seems like it should be a bit
above market rate for the value engineers provide their companies. Even
mediocre engineers provide far more value than the typical market rate. Market
rate for mediocre or even slightly below average talent should be much closer
to the mid-$150k range than it is. Good or excellent engineers should already
have base compensation around the high $100k/low $200k range anyway.
That said, this company is a game company at its core, and I would expect
these salaries and perks to come at quite a high price in other terms
(particularly work-life balance).
~~~
alexanderss
Can't say I agree that "mediocre or even slightly below average talent" should
be compensated in the "mid-$150k range" (is that a way of saying $155k?), so
close to excellent engineers. This seems to misunderstand how market rate is
determined (especially relative to equity compensation), conflate "market
rate" and "the value engineers provide their companies," or defend high
compensation for mediocre performance. The logic is flawed regardless, as
"mediocre or even slightly below average talent" will actually provide the
company with negative value, costing the software team and the company much
more than they offer.
~~~
Iftheshoefits
I meant the "mid-$100k" range, as in around $150k/yr. I understand how market
rate is determined. Frequently it's determined by companies' explicit
collusion or other means, and almost never by an honest appraisal of the value
an engineer actually provides. I was expressing the opinion that the model is
flawed, and that market rate should be much higher than it currently is,
across the board. That is to say, good to excellent engineers should be in the
$200k-$300k/yr or more (base--not all in) range, and average (+/-) should be
around the $150k/yr (base) range. At 33% above the "mediocre" range I suggest,
$200k/yr isn't "so close" to good at all.
Also, the enormous quantity of applications, products, and services with very
poor code backing them is a very strong indicator that "mediocre or even
slightly below average talent" not only does not provide negative value, but
just the opposite. Software companies these days have a lot of revenue, driven
by sales _of a software product_ that engineers get very little (relatively
speaking) compensation for.
------
acornax
Isn't increasing pay like this just part of how the market rate adjusts? In
this case it seems a bit extreme but it seems obvious that if a company is
having a tough time attracting talent they should pay more.
------
mkaziz
This reads like a press release, and not like real reporting ...
------
elwell
> Weeby.com's Michael Carter
.co
Balsamiq - A look back at 2008 - kapitti
http://www.balsamiq.com/blog/?p=531
======
maximilian
He had over $100,000 in revenue, which is pretty cool for 6 months. He says
he's started paying himself a bit, but all that money is his, so it's all
income as I see it. Other than a bit of server space, what are the costs?
Obviously he's going to use some of it to pay for server space for his future
prospects, but that's a pretty penny nonetheless.
~~~
balsamiq
Hi Maximilian, here's a breakdown of my top expenses:
- Salary: 13,500 (46%)
- Equipment: 5,400 (18%) I bought an iPhone and a 23'' Cinema Display this year
- Contractors: 3,500 (12%) I hired 2 devs for 2 different small projects
- Lawyers: 2,780 (9%) I hired a lawyer to help me incorporate and with the EULA
The rest is small stuff.
Yes all the money is technically mine, but I have started paying myself
because I want to keep the rest for taxes and to re-invest in the company.
~~~
tdavis
Next step: get an awesome accountant and start using that money to purchase
anything and everything you could possibly write off. All that excess revenue
will get raped by taxes.
~~~
jotto
No accountant can be awesome enough to write off much more than what is truly
a business expense. The ultimate rule is that everything written off must have
business purpose.
Now, you'll only get in trouble if you get audited, but writing off things
that shouldn't be written off will stand out.
The best way to shelter yourself from taxes is to invest the money and then
pay taxes on a long term capital gain (which have historically been lower than
ordinary tax rates).
~~~
fallentimes
Retained earnings (if that's what you mean by invest) are taxed as well.
------
Jem
I had an excuse - sorry, opportunity - to use the online mockups app this year
to generate some basic example page layouts for my boss. I was impressed by
the speed at which I was able to just throw things together (normally I spend
about 3 hours trying to remember how to use MS PowerPoint.. ick!)
All I have to do now is convince the boss I need the desktop version!
------
lionhearted
Peldi, I've really enjoyed hearing about your journey and congratulations on
your successes. I've learned a lot reading your entries.
In your 2009 outlook, you wrote some goals, one being: "I will have to go from
'mr. Do-it-all' to be Balsamiq’s CEO, which will mean having employees,
delegating..." - one thing from personal experience. When you make your first
hire, you're going be tempted to get a friend you know is good. My advice:
Don't. Your first hires will always see you in the light they had before you
were their boss, and since you're growing so rapidly in business and life,
it's 95% likely to end poorly.
It won't be the end of the world, but I disregarded this advice myself a
couple times. Staff will learn and grow with you to some extent, friends to a
much less extent. And since you seem like a really nice guy with a soft spot
for people, then being in the super difficult position of it not making sense
to keep working with someone - but you're their livelihood - will suck more
for you than it will for a more heartless person.
Your call of course, just have a think over it, and best wishes and much
prosperity in 2009.
~~~
balsamiq
Thanks so much for the advice lionhearted, I'll remember it!
------
zhyder
Congrats Peldi! I'm surprised how dominant the desktop version is over the
other web-platform versions. I think I read in one of your blog posts that you
were too.
~~~
old-gregg
I am not surprised. In fact the availability of an offline version is the main
reason my clients are considering balsamiq. Anything online is inevitably slow
on corporate Internet and unavailable in some meeting rooms, airports or
anywhere on the go. Online apps are fine for time wasting, but when you have
work to do and a paycheck to earn, most people prefer to have their tools
available 24/7.
------
Jasber
This is such a cool story to watch unfold.
One concern I have is there might be _too many_ SKUs:
<http://www.balsamiq.com/products/mockups/pricesheet>
While offering a variety of prices is great, I'm curious if some users don't
become overwhelmed.
~~~
balsamiq
Hi Jasber, the link on my post was actually the first time I shared the
pricesheet page with the public. I never show it to potential customers, but
had to make it for partners and resellers (they really like to make sure they
buy the right SKU for their clients). In other words, people get to the
version they need via the site, which is hopefully clear enough to navigate
(I'm not super-happy with it, but it works ok).
------
bootload
_"... was wondering how long you plan to stay a one-man company for. If you
are in an accident, who will support us?“ .."_
Peldi did you overcome this problem?
_"... I am still confident that the plugin versions will grow over time
relative to the desktop version, as more and more people “see the light” and
start working in the cloud. ..."_ ~ <http://www.balsamiq.com/blog/?p=424>
I think the biggest insight this gave me was that the desktop version had
bigger demand than the web version & that you could ship both using the same
code
base.
Is shipping the desktop version & web version difficult?
~~~
balsamiq
Hi bootload. Re: the "hit by a bus" scenario, I'm working on it on two fronts:
one is hiring a second person, which I will need to do soon. The second is to
entrust the company to my family in case I die, so that they can sell it if
something happens to me.
Re: desktop vs. web, yes it's a surprise, but as I say above, a desktop app
which syncs to the web is a pretty good proposition...we'll see more as I
build my "web connectors" this year.
Seth Godin on why he isn't naked on stage & great marketing advice. - marklittlewood
http://thebln.com/2011/09/seth-godin-on-why-he-is-not-naked-on-stage-other-more-useful-insights-video-talk-transcript-from-business-of-software/
======
DarrenH
Transcript alongside the video is a great way of both experiencing the talk
and also digesting and referring to snippets of real interest when sharing with
others.
~~~
marklittlewood
Thanks. They are far from perfect but do give a good feel.
------
MarcinZ
I thought that the transcripts were a very interesting read. He raises some
very interesting points. Not that many issues transcribing verbatim either.
Watch the Inspiring Movie ‘CODEGIRL’ for Free on YouTube Until November 5th - doppp
http://techcrunch.com/2015/11/01/watch-the-inspiring-movie-codegirl-for-free-until-november-5th/
======
xteo
When I was a much younger lad in my schooling days, programming was not a
popular hobby. It was not one you broadcast to your peers, and certainly not
one that you took pride in. What you created was yours. And if you were lucky
enough to have surrounded yourself with friends that understood your
obsession, you could share with them what you were learning.
One thing I observed is that ignoring the social conventions of what you are
"supposed" to enjoy is much easier for prepubescent and pubescent males than
it is for females of the same age bracket. Females stress far more over
'fitting in', and the approval of others. Unfortunately, this means that in
those critical formative years, females are missing out on the growth and
development needed to make great programmers. Great programming requires long
established brain development in the required disciplines, a deeply laid
foundation of logic, and this is best established very early on.
Much of the bias in information technology, is not because of discrimination
or sexism, but because females aren't in an environment that cultivates the
necessary skills. This is not necessarily something that requires intervention
from educators, or politicians. It's something that teenagers need to value,
and to appreciate.
When I was a young lad, anti-intellectualism was a religion, and was heavily
practiced in the social fabric I was required to navigate. I'm unsure if that
trend has been corrected, but I am sure that we're seeing the side effect of
that in the professional landscape with the number of qualified females with
an interest in the STEM fields.
Tesla: Solar Car Roof Idea Is All but Dead - pulse7
https://stocknews.com/news/tsla-tesla-inc-tsla-solar-car-roof-idea-is-all-but/
======
pitaj
Sounds about right, to me.
Solar cells are heavy, require maintenance, and the small amount of area on
the roof of a car is unlikely to improve real-world driving range, especially
since the cells will be at an angle to the incoming sunlight most of the time.
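A rough back-of-envelope in code makes the point; every number below is an
illustrative assumption, not a Tesla spec:

    // Best-case daily solar harvest from a car roof vs. EV consumption.
    const roofArea = 2;       // m^2 of usable roof, assumed
    const insolation = 1000;  // W/m^2 in full sun
    const efficiency = 0.20;  // assumed cell efficiency
    const sunHours = 5;       // equivalent full-sun hours per day, parked outside
    const whPerMile = 300;    // assumed EV consumption

    const whPerDay = roofArea * insolation * efficiency * sunHours; // 2000 Wh
    console.log((whPerDay / whPerMile).toFixed(1) + " miles per day"); // ~6.7

And that is with the cells square to the sun all day; at the incidence angles
a fixed car roof actually sees, the real figure would be lower still.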
Ask HN: How Many HN Readers are going to the Startup Weekend at SF today? - code_devil
http://sf2startupweekend.eventbrite.com/
Twitter HashTag: #swsf09
- if they are going, what's on their agenda?
- if not, what products/projects would they like to come out of this?
ps: I am going, will probably try to build something over the FB platform or something that uses Twitter.
======
code_devil
Twitter HashTag: #swsf09
- if they are going, what's on their agenda?
- if not, what products/projects would they like to come out of this?
ps: I am going, will probably try to build something over the FB platform or
something that uses Twitter.
The Nerd as the Norm - paulpauper
https://everythingstudies.com/2017/11/07/the-nerd-as-the-norm/
======
Mysterix
From the first link, this part is so relatable:
"They'll stop going to the company picnic if it becomes an occasion for
everyone to list all the computer problems they never bothered to mention
before."
------
pasabagi
I don't really think his concept of a nerd works, since he lumps in lots of
extraneous characteristics and values.
I think a more explanatory description would be, a nerd is somebody who is
primarily interested in technical questions. That predisposes nerds to avoid
ambiguity - but by no means excludes nerds like literature professors, who are
absolutely obsessed with ambiguity.
It's generally a better idea to categorise people by priorities, as opposed to
preferences - since preferences tend to be very variable.
~~~
nickthemagicman
I think there's several sub-categories in the nerd culture. Geeks, Dweebs,
Nerds, etc.
I think Geeks are the literature/pop culture equivalent of nerds. Whereas
nerds deal with more scientific obsessions Geeks deal with more cultural
obsessions.
Here's a Nerd/Dweeb/Geek Venn diagram.
[https://www.popsugar.com/tech/Geek-vs-Nerd-vs-Dork-vs-
Dweeb-...](https://www.popsugar.com/tech/Geek-vs-Nerd-vs-Dork-vs-
Dweeb-8177870)
------
bootsz
> _It would be nice to have nerdy interests and sensibilities be the norm for
> once; to get to feel as if society is organized with me in mind, and not
> feel a bit like an anthropologist observing an alien civilization._
For real though...
------
thepra
"Nobody is going to respect you for having feelings."
Funny, the inverse world.
------
marcus_holmes
I love the idea of a name for non-nerds, but don't know how to pronounce
"wamb" \- is the b silent as in lamb?
~~~
peterburkimsher
Perhaps "dren" would be a better term?
~~~
soylentcola
Such vulgarity!
------
AstralStorm
Why even try to show there is a "norm" as opposed to a multimodal distribution
of traits?
These are probably different peaks in the spectrum. Unfortunately, this truth
runs counter to people who want to control everyone... or at least predict.
The description of the anti-nerd would completely fall apart in Asian culture
for instance.
------
nickthemagicman
Great article. Very clever, the flipping of the traits.
------
powerslacker
really interesting piece.
Visual Explanation of the Conjugate Gradient Algorithm - mercurymercury
https://pwacker.com/CG.html
======
nimish
I always thought it was simpler to explain by using the only trick linear
algebra has: switch to a basis where your problem is easiest, then switch
back.
Sylvester's law of inertia proves existence + Gram-Schmidt constructs that
change of basis.
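To illustrate that view, here is a minimal conjugate gradient sketch in plain
JavaScript (a toy, not a library implementation): the successive p vectors it
builds are mutually A-conjugate, which is exactly the "convenient basis" in
which the quadratic decouples into independent 1-D problems.

    // Solve A x = b for a symmetric positive-definite matrix A.
    function conjugateGradient(A, b, tol = 1e-10) {
      const n = b.length;
      const matVec = v => A.map(row => row.reduce((s, a, j) => s + a * v[j], 0));
      const dot = (u, v) => u.reduce((s, ui, i) => s + ui * v[i], 0);

      let x = new Array(n).fill(0);
      let r = b.slice();   // residual b - A x, with x = 0
      let p = r.slice();   // first search direction
      let rs = dot(r, r);

      for (let k = 0; k < n && Math.sqrt(rs) > tol; k++) {
        const Ap = matVec(p);
        const alpha = rs / dot(p, Ap);                  // exact minimizer along p
        x = x.map((xi, i) => xi + alpha * p[i]);
        r = r.map((ri, i) => ri - alpha * Ap[i]);
        const rsNew = dot(r, r);
        p = r.map((ri, i) => ri + (rsNew / rs) * p[i]); // next A-conjugate direction
        rs = rsNew;
      }
      return x;
    }

    console.log(conjugateGradient([[4, 1], [1, 3]], [1, 2])); // ~[0.0909, 0.6364]

In exact arithmetic it terminates in at most n steps, one per basis direction.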
~~~
zzleeper
I was recently re-reading how to solve systems via QR and realized that the
key trick was switching basis. But it took me a while to understand that, in
between all the extra stuff. Now you are telling me that most of LA is like
that and wow, so many things make sense now!
Do you have any additional linear algebra tricks?
~~~
nimish
Linear Algebra Done Right is IMO the best book on it.
But a lot of it is simply exploiting linearity to reduce to working in the
most convenient subspaces. Finding and constructing them is a major task.
------
jl2718
CG, most elegant of quadratic methods. Now let’s talk about why you wouldn’t
use it.
1. In the form given, you need to know the terms of the quadratic.
2. If you know the terms, you can compute a stable point using QR, which is
more efficient.
3. IIRC CG has better numerical stability than QR, but only under strict
quadratic assumptions.
4. If you don’t have the form of the objective, but you know it’s quadratic,
and you can compute a gradient, then it also works, but errors in the
gradient compound.
5. If it’s not quadratic, then you have the same issue. You can try to
account for this using resets or more complicated methods, but then,
6. For so many problems, the stability proof becomes too onerous or the
evaluations become less efficient than simply doing more steps of gradient
descent, which is why this is still the dominant method in neural networks,
despite so much effort on ‘our’ part to find more efficient solvers.
------
vladTheInhaler
Here is another resource I came across when learning about the conjugate
gradient method for a class on finite elements. I wish I had found this back
then!
[https://www.cs.cmu.edu/~quake-papers/painless-conjugate-
grad...](https://www.cs.cmu.edu/~quake-papers/painless-conjugate-gradient.pdf)
~~~
clemParis
Thanks! Are there other great math papers or articles with a similar "without
the agonizing pain" approach, for other topics that might be interesting?
Could An Omnipotent Being Prove It? - robg
http://www.juliansanchez.com/2010/10/04/could-an-omnipotent-being-prove-it/
======
ajuc
If this being is omnipotent, it should be able to prove anything. Otherwise it
is not really omnipotent (for it can't do this one thing).
BTW - talking about omnipotence using logic is useless - omnipotence can't be
described by logic, because it leads to paradox (can omnipotent being do
something, that omnipotent being can't do?).
Anyway - interesting questions.
About coding the “FizzBuzz” interview question - micheleriva
https://www.jsmonday.dev/articles/30/about-coding-the-fizzbuzz-interview-question
======
rvz
Good article, but you forgot one more thing: Readability.
Given that there are zero comments in the code, you are giving the impression
of a "the code is the documentation" attitude, which forces the programmer to
infer from the function signatures only and compromises clean code and
maintainability. It is important to explain what these functions and one-
liners do, as a different set of eyes might not easily understand them.
If a candidate did this in a pair-programming session, I'm afraid I would
not accept the solution unless a reasonable amount of documentation on the
function signatures, like JSDoc, were present.
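For instance, even a header this small on the article's isMultiple helper
would answer most of a reviewer's questions up front (the doc text here is
mine, for illustration):

    /**
     * Reports whether `num` is an exact multiple of `divisor`.
     *
     * @param {number} num - The value under test.
     * @param {number} divisor - The candidate divisor; must be non-zero.
     * @returns {boolean} True when num % divisor === 0.
     */
    function isMultiple(num, divisor) {
      return num % divisor === 0;
    }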
------
gregjor
At first reading I thought the author intended satire.
The first two code examples meet the stated requirements, except for an off-
by-one error in both:
for (let i = 1; i < 100; i++) { ... }
Requirement: Print the numbers from 1 to 100. Oops.
The second example confusingly appends to the _output_ variable -- except for
the case of outputting the number -- although that variable gets set to empty
on every iteration of the loop, and only one if/else block will get executed.
Minus one for clarity.
Then the author goes off the rails and keeps going.
_But there is still a big problem about maintainability. What if we want to
add more control structures? Let’s say we want to print “Foo” when a number is
multiple of 7 and “Bar” when it’s multiple of 11..._
Possible future requirements are not called "maintainability." The author
gives a good example of YAGNI, though. Then after calling out this supposed
problem the author does not give an example of how to address the possible
proliferation of if/else conditions, even after calling it a "bad, bad, bad
idea." Instead he shows an even less idiomatic and less clear _switch_
construction that _has exactly the same problem_ , with even more possibility
of error due to leaving out a _break_ by accident. The supposed
maintainability problem -- what if we have to deal with 7 and 11 in the future
-- never gets addressed.
Then it gets worse. To avoid the for loop, which has one local variable, the
author creates an array merely to iterate over it. Let's think about how that
scales to doing FizzBuzz from one to ten million, since the author brought up
scalability (although that wasn't an original requirement).
_But now… let’s make the things harder. How would you approach the same
problem without using loops?_
The problem explicitly describes a loop from 1 to 100. But...
_That is a tricky question and will show the candidate deep knowledge of that
particular programming language. I mean, there are just a few programming
languages that doesn’t support loops, so we’re all pretty addicted to them!_
Sorry, _map()_ is just a loop, a loop with function call overhead.
This code has another off-by-one error, apparent if you just run the code in
the browser. It assigns the local variable _num_ to _i + 1_ , since array
indices begin at 0, but uses _i_ in all of the calls to _isMultiple()_ when it
should use _num_. Running the code I get FizzBuzz as the very first output
line. Note that the _for_ loop solution didn't have this problem, but now the
details of Javascript array indexes have surfaced into the code and become a
new problem to think about.
This bug -- using _i_ when the parameter is named _num_ -- carries into the
final example, which throws an error right off. That tells me the programmer
didn't test this code, or even read through it carefully.
Looking at the test suite I see that it doesn't check for boundary conditions,
like passing 0 to _isMultiple()_. I would want to discuss what _isMultiple()_
should do when passed a 0.
At this point the code bears almost no resemblance to the original problem
statement, so as a hiring manager I would give a C- for a whiteboard exercise,
and an F for a published article. I've seen several off-by-one and variable
naming errors in the iterations, which looks careless, especially for code I
would expect the programmer to write iteratively in a JS console. I've seen
simple code balloon into something complex and unreadable, and I can't just
copy/paste it into a Javascript console anymore.
I wanted to see if the candidate can code a simple problem _at all_ , not for
a demonstration of increasingly obscure techniques and scope creep. This
programmer may know Javascript, but they are going to drop spanners into every
task I assign, and waste other team members' time polishing turds like this.
~~~
gregjor
What I expect to see as an interviewer:
for (let i = 1; i <= 100; i++) {
let output = "";
// test modulos in ascending order
if (i % 3 === 0) output += "Fizz";
if (i % 5 === 0) output += "Buzz";
if (output === "") output += i;
console.log(output);
}
Look Ma, I can maintain it!
for (let i = 1; i <= 100; i++) {
let output = "";
// test modulos in ascending order
if (i % 3 === 0) output += "Fizz";
if (i % 5 === 0) output += "Buzz";
if (i % 7 === 0) output += "Foo";
if (i % 11 === 0) output += "Bar";
if (output === "") output += i;
console.log(output);
}
Since I didn't introduce any new functions like _isMultiple_ as an unnecessary
shorthand for the built-in modulo operator, and I didn't introduce unnecessary
arrays to iterate over the index of, I don't need to write a test suite. I can
simply compare the output to the expected results.
Not having the function call for every value will help this "scale" if
necessary, though for this problem scaling was neither stated as a
requirement, nor implied by any reasonable reading of the requirements.
Putting on my hiring manager hat, if a candidate changes a _for_ loop to a
_map()_ because "there are just a few programming languages that doesn’t
support loops," I would ask the candidate to name some of those languages, and
how one could solve a problem like FizzBuzz without loops. I would expect to
talk about recursion and declarative languages (Prolog, or even SQL), not
about `map()`.
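For what it's worth, here is where I'd expect that conversation to land:
plain recursion, sketched below (my code, not the article's):

    // FizzBuzz without a loop construct: recurse on the counter instead.
    function fizzbuzz(i = 1, max = 100) {
      if (i > max) return;               // base case ends the "loop"
      let output = "";
      if (i % 3 === 0) output += "Fizz";
      if (i % 5 === 0) output += "Buzz";
      console.log(output || i);
      fizzbuzz(i + 1, max);              // the recursive call plays the role of i++
    }
    fizzbuzz();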
Smooth animation on (kindle model) e-ink display - pedalpete
http://www.engadget.com/2011/02/16/bookeen-shows-off-fmv-on-a-standard-e-ink-pearl-display-video/
======
HelgeSeetzen
I held a colour 60Hz electrophoretic display a decade ago (first startup).
Electrophoretic displays can go at very fast refresh rates if you just zap
them hard enough (i.e. high voltage on the TFT). Without higher voltage they
can still run fast but at the expense of losing contrast. You basically have
little black and white "balls" (electrophoretic particles) travel up and down
in the field (in those little e-ink cells). At conventional voltage the balls
won't travel the full distance in a single fast image frame, so you won't get
full black or full white. The problem is that people want colour, speed and
low power consumption. Those are opposing requirements and E-ink made their
bet on power (at the expense of colour and speed).
SL0T0X – A modular penetration testing interface - slotleet
https://github.com/Slotleet/SL0T0X
======
alphachloride
It looks like a note-taking app with file attachments. Are there any docs or
detailed description?
Freenode and irc.com - KindOne
https://freenode.net/news/freenode-irccom
======
KindOne
Previous discussion about it.
[https://news.ycombinator.com/item?id=17375831](https://news.ycombinator.com/item?id=17375831)
Lawsuit Exposes Internet Giant’s Internal Culture of Intolerance - YouAreGreat
http://quillette.com/2018/02/01/lawsuit-exposes-internet-giants-internal-culture-intolerance/
======
qbaqbaqba
Oh no, another interesting news item concerning Google getting immediately flagged.
Googlers, please don't be evil; at least provide your motives.
~~~
pentae
The motive seems pretty clear - cover up, accuse, and be a hypocrite.
------
ben_jones
These two statements struck me:
> At a “Diversity Team Kickoff” event, a director announced plans “to freeze
> headcount so that teams could find diversity candidates to help fill the
> empty roles,”
> Next time you get invited to speak at a conference, especially if you’re a
> white male – ask the organizer to confirm you’re the only white male on the
> panel / in the speaker lineup. If not, say you are honored, but must
> decline, and give the reason. And because you are at Google, guess what –
> they’re going to change the panel for you.
~~~
xkcd-sucks
Sounds like a really good tutorial on leveraging your privilege to marginalize
someone
------
rhapsodic
Damore's lawsuit is a huge story in the tech industry, yet, for some reason,
stories about it always get flagged on HN.
------
rhapsodic
It's frightening to learn how many hateful, vindictive people have settled into
positions of power at one of the most powerful companies in the world.
------
gorbachev
The comments on that article remind me why I should never read comments on
online articles.
------
jacksmith21006
Maybe an unpopular view on HN but I have no problem with Google letting Damore
go.
Work is to do work and in the US you often times do not even know who the
person one cube over voted for.
Damore shared his views on Reddit without using Google name is fine but at
work and multiple times even after told to stop is going to be a problem, imo.
~~~
mankash666
If Google treated all poiltical discourse within it's walls like it did
Damore, there probably wouldn't be an issue. The whole problem is with
favoring one type of speech/thought in a militant fashion that violates
federal law.
And regardless of the law, it's UNETHICAL to discriminate in the name of
diversity. In 2018, companies are expected to do the right thing. This article
paints a very damaging picture of Google discriminating against white males
(Disclaimer: I'm NOT a white male)
~~~
nezzle
> it's UNETHICAL to discriminate in the name of diversity.
I think Amazon disagrees with you. They include this in their job postings:
>Amazon is an Equal Opportunity-Affirmative Action Employer – Minority /
Female / Disability / Veteran / Gender Identity / Sexual Orientation.
I don't think you can be both equal opportunity and affirmative action at the
same time.
~~~
mankash666
You've completely misunderstood & misrepresented my viewpoint. Equal
Opportunity-Affirmative Action IS the law of the land, and that's the same law
that forbids setting artificial quotas for ANY group that discriminates
against others. For instance, some Google execs called for a 25% minority
quota - THAT is discrimination against other groups, especially if the overall
statistics in the talent pool/job market make it impossible for the said quota
to be fulfilled in a manner compliant with "Equal Opportunity-Affirmative
Action"
For instance, you might want to setup a workforce of coders with a 50-50 Men-
Women breakdown. But since STEM graduation rates indicate a Men:Women ratio of
85:15, it is unlikely, if not impossible to construct a 50:50 workforce
without willfully discriminating against men. If you now force the entire
industry to adopt the 50:50 rule with a 85:15 talent pool to choose from,
you're both discriminating AND defying logic/math!!
In summary - no corporation can possibly disagree with me for I'm calling for
the law to be followed as intended.
China won’t listen to West about genetically modifying the human embryo - prostoalex
http://qz.com/441423/why-china-wont-listen-to-western-scientists-about-genetically-modifying-the-human-embryo/
======
venomsnake
A little macabre humor:
Battle of the day - Chinese ubermensch vs western rogue AI controlled drones.
Charter Cities - mhb
http://chartercities.org/blog/33/a-charter-city-in-cuba
======
tjic
> To help the city flourish, the Canadians encourage immigration. It is a
> place with Canadian judges and Mounties that happily accepts millions of
> immigrants. Some of the new residents could be Cuban émigrés who
So the Canadian government, despite having no authority and no mandate to do
so, decides to start administering an overseas possession ... except it's not
really a possession.
Canada does this ... _why_?
> Initially, the government of Cuba lets some of its citizens participate by
> migrating to the new city.
The Cuban government, like the East German government, chooses to _murder_
people who try to emigrate. When given a choice, people leave dictatorships,
and because dictatorships are slave societies built for the benefit of the
rulers on the backs of the citizens, this makes perfect sense. Farmers don't
want their livestock wandering free, and neither do dictators.
> With clear rules spelled out in the charter and enforced by the Canadian
> judicial system, all the infrastructure for the new city is financed by
> private investment.
Why do private investors want to take a gamble on such a crazy scheme?
Why not just invest their money building infrastructure in places where people
_already_ choose to live?
> The structure of the charter could be very different, perhaps with several
> partner nations in place of just one. The benefits could be just as large.
Huh?
Look, I'm a lunatic libertarian. I'm in favor of oceansteading, L-5 colonies,
anarchocapitalism, etc. ... so when I say that something sounds like a really
ridiculous dope-smoking idea, that _means_ something...
~~~
anamax
You're forgetting something. Some people really, really want Cuba to succeed.
That's why they're not proposing to set this up in, say, Haiti.
Since they're essentially arguing for "we're going to show the savages how to
do things", Cuba is a reasonable choice because it already has police state
apparatus.
------
mhb
TED talk: <http://www.ted.com/talks/paul_romer.html>
| {
"pile_set_name": "HackerNews"
} |
Hire people who aren’t proven - type12
https://leonardofed.io/blog/startups-hiring.html
======
maxxxxx
I think hiring has become more difficult now that programming has been
discovered as a well-paying mainstream career. When I started in the 90s most
people I worked with had a passion for the craft, but now we interview a lot
of people who have a CS degree just for the career prospects, not out of
interest in the craft.
I find it much easier to deal with someone who has no relevant experience but
cares vs someone who had 10 years experience but doesn't care. Now someone who
has 10 years AND cares is rare but pure gold.
~~~
snowwrestler
By all means, you should use whatever criteria you would like to hire your own
staff.
But in general, I would encourage folks in computer tech industries to be
hesitant to assume such a binary approach to evaluating prospects. "Did you
code for fun in high school?" might be a useful question now, because software
development as a field is so young that high school students can try out
significant work easily.
But in most high-end professional fields, that's not true. Imagine asking a
prospective medical resident "did you treat any diseases in high school?" or
"how much surgery do you do in your spare time?"
I think we should take it as a positive sign that people are approaching
software and computer technology as a career that they can professionally
pursue. That approach can produce great work too, and a growing field means a
greater diversity in how people find and display their passion.
Getting a CS degree is not easy, and requires some base level of interest and
commitment to complete. Even a bootcamp is not free, and takes some focus and
work.
Anyone above a certain age (like me) came into the industry sideways, as it
was developing, as a result of personal passion and interest. Let's not
mistake that for an inherent property of a good tech employee... every
industry on Earth started that way at some point.
Computer technology industries are maturing, just like railroads, oil
development, aviation, telecommunications, and dozens of other industries have
over time. That's not bad, and we will have to take it into account as we
consider the model employee.
Hiring people who are "not proven" might also mean hiring people who are well
trained, but maybe have not yet proven their passion in a way you recognize.
~~~
namanyayg
Doing software development in high school requires a computer (even a ~$400
machine shared with the family is sufficient), interest, and time (so one can
spend the hours required reading tutorials/watching videos). Basically,
nothing too much out of the reach of your average teenager -- with the
exception of interest/dedication to the art.
Medicine, like most other professions, cannot be practiced without a lot more
investment. If we lived in a society where the resources required for
practicing those careers were easily available to teenagers, we would see
young high school students do that too. I don't think you're making a fair
analogy here; there is basically no way a high schooler could even _attempt_
treating diseases, but it's very possible for a high schooler to program 6-8h
a day and learn.
~~~
Xylakant
> and time (so one can spend hours required reading tutorials/watching videos)
This is maybe not out of reach for an average teenager, but it is definitely
out of reach for quite a substantial portion of the population, the poorest
percentile might just not have the time to dabble in programming. They might
have to take care of siblings, work for a living or to pay for college later.
~~~
snowwrestler
Right, looking for signals of "passion" has to be done carefully so it doesn't
become just a proxy for socioeconomic class. This is also true when thinking
about "cultural fit" within a company.
~~~
munificent
This is an interesting case where the laws around interviewing may make this
harder.
What you want to measure in the interview is passion. What you can see is
their accomplishment. But accomplishment is roughly passion × means. A rich
person can accomplish more for the same effort because they have more power at
their disposal.
But you can't easily cancel out means in the interview because you sure as
hell can't ask any direct questions about their socioeconomic level, for good
reason.
------
laurentl
> 1\. Can this candidate do the job?
I'd go so far as to ask: "can this candidate _learn_ to do the job?". In our
recent job postings, I've started adding a specific paragraph after the
desired qualifications stating more or less "if you don't tick all the boxes
above but are motivated to learn and grow, please apply. We'll teach you what
you need to do the job"
> 2\. Will this candidate be motivated?
This. Even more than question 1. I've had a lot of good surprises with
motivated candidates who didn't have all the expected qualifications. Some of
my best hires were people whose CV didn't line up with what I was looking for
but who demonstrated impressive motivation to get the job. On the other hand,
I've often been disappointed with "perfect" candidates who didn't have the
right mind-frame for the job. (Note: I'm not saying I'm expecting slavish
devotion and "giving it 200%" every day of the week. But if your personal,
intrinsic motivations don't align with the job's responsibilities, it won't
work).
> 3\. Will this candidate get along with coworkers?
Duh. But also, how do you actually test objectively for this without
introducing bias? And how do you ensure you're not creating a monoculture that
will eventually harden into navel-gazing and dogma? I'm really of two minds on
this topic.
> 4\. What this candidate will be in three, six, twelve months from now?
Related to the rephrasing of 1. Can the candidate learn and grow? And will we
provide the right environment for them to grow?
One of my favorite quotes is from an ex-manager who hired me when I was far
from ticking all the boxes on the job description. "If you hire someone who
has all the necessary qualifications for the job, they'll be bored in 6
months."
~~~
tomnipotent
>> 3\. Will this candidate get along with coworkers?
> Duh. But also, how do you actually test objectively for this without
> introducing bias?
You can't because it's a bs metric like "culture fit". After about a dozen
people, you can no longer ensure people will get along or like each other. I
only have five brothers but I don't even like all of them.
People have quirks, and it's easy to find reasons to pass on candidates
because of them (reminds me of Seinfeld and how the characters ended
relationships because of man hands or toes). I see this a lot in non-technical
hiring where marketing folk exclaim "I like the candidate, we have great
rapport" only to discover the candidate was subpar.
> I'm really of two minds on this topic.
I used to be, but after suffering through a bout of mental illness that
rendered me an anxious mess when I was once the life of the party really
changed how I evaluate this dimension. Be kind and assume the best, which we
can all agree would be amazing if the tables were turned.
~~~
nomel
I disagree that this is a bs metric.
When I look for personality, I'm looking for three things:
* How they ask questions for things they don't understand
The problems I give are directly applicable but I leave a few slightly vague.
Someone really experienced could fill in the missing pieces easily, but
usually this doesn't happen. I then ask them if the problem makes sense or if
I missed anything. If they can't tell me that the problem isn't clear, or they
need more information, then they'll have trouble working with a team trying to
solve and communicate problems.
A mediocre candidate will say that the problem isn't clear with "I don't
understand". A good candidate will explain what they don't understand. A great
candidate will ask for clarification on the vague piece without much fuss.
* Reaction to not knowing something.
I have a hard problem that I give at the end. I clearly tell them it's not
expected to be solved and that I just want to talk about how it might be
approached. If they get angry, say "oh I know how, just give me 5 more
minutes" then stand there blank, etc, then they're not going to fit in a team
that is trying to solve hard problems together.
A good candidate will stay calm and provide any sort of input, ask any sort of
questions, or show any sort of interest. Sometimes people get nervous here, so
I keep this very very lighthearted.
* Are they full of themselves/assholes
This is pretty evident within the first few minutes, and really rare (and
almost always accompanied with a stream of buzzwords after every sentence).
A good candidate won't be an asshole.
~~~
tomnipotent
Everything you've mentioned is a set of shallow assumptions based on brief
exchanges with people. These are not qualities you can effectively suss out in
a few hours through an interview process. You can at best get hints, but just
as GPA is not an indicator of a student's on-job success, neither is an
interview - or our conclusions from it - any better an indicator of the
candidate's success.
> Are they full of themselves/assholes / This is pretty evident within the
> first few minutes
No, it's not. People can act awkward during interviews; it's a stressful
situation. Some people need to peacock to feel self-confident - I may not like
it, but I'm not going to fuck with their future because of an emotional
reaction I had to how they present themselves. I've hired plenty of "assholes"
that just needed the benefit of the doubt and a chance to grow.
We're not psychologists or therapists, so we should stop trying to "figure
people out" and decipher complex human interactions & behavior by trying to
bucket them into checkboxes for whether or not someone is good.
Be kind, give the benefit of the doubt. This is someones career on the line,
not a first date. Treat it with the respect and gravity you'd like someone to
give you.
~~~
sound1
Hey, if a candidate is ready to be an asshole in an interview, what are the
chances that he will be an asshole to his team members just to get that raise
or a promotion?
~~~
tomnipotent
He's not being an asshole. He's probably stressed out and doesn't know how to
cope well. This is behavior that can be addressed and improved. After 20 years
and hiring 100+ developers through three exits and an IPO, I've only met 1
actual "asshole" that couldn't integrate and needed to be let go.
The problem is that we're too quick to be offended and we look for excuses.
I'm in the middle of hiring a business analyst, and my COO didn't want to hire
him because he didn't send her a thank you note after an interview (but he did
send one to me). She thought he was an asshole. See the problem with that
train of thinking?
Everyone is an asshole to someone.
~~~
scarface74
_He's not being an asshole. He's probably stressed out and doesn't know how
to cope well._
Life is stressful, work is stressful. My manager and his manager are
reasonable people but after my second assignment, they said I missed a
requirement and that it was a “big f’ up”. Guess what? I found that refreshing
after dealing with managers that beat around the bush and you had to
constantly try to figure out what they were thinking - or even worse, they
didn’t tell you anything until your review.
I’ve had to whiteboard architecture in front of CxOs, be interviewed by
potential investors, etc. It’s when things get stressful that you really need
people who can keep their wits about themselves.
~~~
tomnipotent
> they said I missed a requirement and that it was a “big f’ up”
It's free to say things, doesn't make it true. Too often we assume because
someone is in a position of power - or is wealthy - that whatever they say
must be true. It's not. Sure, we have to nod our heads and pretend it is to
keep the job but that still doesn't change the reality.
Shit rolls down hill with exponential momentum. I've seen many instances where
a CEO says something innocuous like "I'm a bit disappointed in X" but by the
time it gets to someone that can fix it the management in between transformed
it from a simple comment into a condemnation. People like to exaggerate things
to make something seem more important or impactful than it really is.
There's also a big difference between an interview and dealing with work
every day. I know plenty of people that shine under the pressure of interviews
but not under the job itself (and vice versa). You cannot determine these
things about a person from an interview. Period. Performance in an interview
has very little correlation to job performance (if any).
> you really need people who can keep their wits about themselves.
I need a diverse group of people that I can collaborate with to get things
done. If they can't keep their wits about them, it's my job to deal with that
issue and get them back on track and protect them from organizational crap.
Might as
well expect every girl or guy you date to be a model with a PhD.
~~~
scarface74
_It's free to say things, doesn't make it true_
Well, when the requirement was in big bold writing as one of the key features
on a PowerPoint slide...
_I need a diverse group of people that I can collaborate with to get things
done. If they can't keep their wits about them, it's my job to deal with that
issue and get them back on_
That’s not a luxury you have as you move up the ladder - even if moving up the
ladder is just being a real senior developer/architect (by knowledge if not by
title).
My first job at 22 was as a computer operator trying to get my foot in the
door; within 6 months I had built a custom, networked data entry system used
to support a completely new department and a new line of business. Working at
small companies, you don't get the luxury of hiding within the bureaucracy.
The one time that I did work at a large company, it was suffocating.
------
wgerard
My co-founder and I believe so strongly in this that we started a company
around it.
So many companies are willing to let perfect be the enemy of good. It's why
this concept of an MVP has to be hammered in over and over again. And yet, we
still let this mentality pervade hiring. We hire people on the slim chance
that they'll need to re-implement a consensus algorithm and not on the 99.99%
chance that all of their work will be writing CRUD-like code.
We (as in, Headlight) deal with a lot of bootcamp grads and people who are
entering tech later in life (which for tech, means 25+). It's shocking the
number of them who are insanely adept at software development for the amount
of experience they have and are completely overlooked because of any of the
following:
* Their pedigree
* Their experience given their age
* Their program's focus on practical software development and not on more academic topics
It's really mind-boggling. It's great for us, because it's a totally
unappreciated and under-served market. Still, I can't imagine how frustrating
it is for those candidates. We're still new, but so far our clients'
satisfaction with our candidates has been nothing short of enthusiastically
positive.
All that to say, you should really consider adjusting your hiring expectations
drastically. You're building a house. Why are you trying to hire a civil
engineer, and not a contractor?
~~~
davio
Most corporate dev jobs can be effectively handled by someone who can reliably
show up and pull words from a database and display them on a screen.
~~~
mushishi
Yes, but it's mind-boggling how complex a straightforward system can become
when new requirement changes are piled on top of old ones without maintaining
a proper data model in the application. Even if the database is fine-ish, the
result is a monster that slows down both the system and the development speed.
So in principle a lot of people can do it, but only some do it without making
things more difficult for the next person.
------
docker_up
The problem that has happened at my company is that we take a "chance" on
bootcampers or other people who aren't proven, we spend the time and effort to
mentor and train them, pay them well, and then they turn around in 9-12 months
and leave us for another company with their enhanced experience. It has
happened 3 times so far, so we have stopped hiring bootcampers and fresh
grads, and instead have started hiring people with a couple of years or more
under their belts, since they appear to at least want to stick around long
enough to contribute positively to our teams.
~~~
diminoten
Conversely, I found it _really_ hard to establish myself as a solid
contributor and even a technical leader when I joined a team as a (relatively)
newly minted developer.
No matter how far I progressed, or how competent (even excellent) I became, I
was _still_ perceived as the "kid", even though my colleagues are only a few
years older. I think I've mostly overcome this, but TBH I'm still not certain
and I've been here for ~3.5 years in this role, and have been promoted to
senior developer. None of that matters to my coworkers.
Moving teams is a much easier way to shed that "newbie" reputation (a path I
decided not to take). Is it possible someone is creating this culture at your
work that locks people into their role when they join the team?
~~~
techsin101
What diminoten said
------
mirko22
Lately what I have been trying out is telling people: if you want to come in
for an interview, can you prepare a small presentation on one of the data
structures that you know about or like, ideally hash tables (as they cover a
wide range of topics).
If a person then comes in for the interview about a week after being told
this and can't present that DS to some depth, I don't bother going on with a
much longer interview.
I got a lot of criticism for this, ranging from "who needs data structures,
we are not building libraries" to "this is too complex to ask a person".
But in my opinion, if someone tells you what you are going to be asked in the
interview and you don't even bother to prepare at least a little bit, I can
assume you will act like that at work too.
Why a data structure? Why not - you don't necessarily need to like or know
anything about the domain you will be working in, the same way you might not
care about or like data structures.
This doesn't apply to someone writing HTML maybe, but for a senior programmer
it should be easy to figure out how a simple linked list works; if you don't
care about learning it, I don't care about wasting my time interviewing you.
I would like it if some of you could give me your thoughts on this approach.
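To show the level of depth I mean, here's a minimal sketch (a linked list
rather than a hash table, and in Python - purely illustrative, not the exact
exercise):

    class Node:
        """One element of a singly linked list."""
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    class LinkedList:
        """Minimal singly linked list: O(1) prepend, O(n) search."""
        def __init__(self):
            self.head = None

        def prepend(self, value):
            self.head = Node(value, self.head)

        def find(self, value):
            node = self.head
            while node is not None:
                if node.value == value:
                    return node
                node = node.next
            return None

A candidate who can walk through something like this and discuss the trade-
offs (memory layout, cache behavior, when an array is better) has prepared
more than enough.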
~~~
varjag
Is it really possible to be a senior programmer without understanding the
concept of a linked list? Mindboggling.
~~~
jcadam
I interviewed one who had a lot of trouble finding the largest integer in an
array... I mean, it really was a bozo-filter level problem, and it amazed me
how many people couldn't do it.
Personally, I hate white-board interviews, but I'd be relieved (and yet
annoyed at the same time) if you asked me to find the largest integer in an
array.
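For anyone wondering, the expected answer is just a linear scan - something
like this sketch (in Python, max(xs) is the one-liner, but writing the loop
is the point):

    def largest(xs):
        """Return the largest integer in a non-empty list."""
        best = xs[0]
        for x in xs[1:]:
            if x > best:
                best = x
        return best

    assert largest([3, 9, 2, 7]) == 9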
------
ravenstine
Why would any company _have_ to hire people and train them? Companies these
days don't really come to a halt if they're in need of a few developers, and
as the market is saturated, they can hold out indefinitely until they believe
they have a "rockstar."
According to my experience, a lot fewer companies than even a few years ago
are in a hurry to talk to you despite them having a job posting and you having
more experience. Most of them will take their sweet ass time, and it will take
even longer if they've replaced their HR or their own recruiting process with
a recruiting firm, and usually they'll engage in the same process of finding a
"rockstar" while taking their time because then they can fit in time for
another client and make more money.
Besides, the incentive for hiring the unproven developers is quickly dying off
with services like Triplebyte that can test and interview developers for you
for a nominal fee. Those who would normally be in charge of hiring at a
company never get to the point where they've interviewed a bunch of people,
decide that they're tired of interviewing, and hire the candidate who seems
the most intelligent.
~~~
SmellyGeekBoy
> Companies these days don't really come to a hault if they're in need of a
> few developers
My small company has had to put our biggest contract on hold since July after
having 2 senior developers leave in quick succession. We're burning through
cash and have only just this week managed to find a replacement for one of
them. I've almost gone down the agency route (again) but the fees are pretty
hefty for a team our size (3 at the moment).
I'm not looking for rockstars, just someone with some familiarity with .NET
MVC.
Edit: We're in the UK, FWIW.
~~~
bvm
I feel your pain. Trying to hire perm in London at the moment is a real
challenge, even with a recruiter. Contractor rates are obscene (5-8x perm).
Recruiting and retaining is what keeps me awake at night over and above any
tech issues that I'm working on.
~~~
user5994461
How little are you paying your dev that it can be 1/8th of a contractor?
Even the cheapest dev I have ever met in London wasn't as low as 1/5th of the
most expensive contractor I have met.
~~~
bvm
Ah yeh, sorry, my mental maths strayed a bit into hyperbole there, it's more
like 4-5x
~~~
user5994461
How much are you trying to pay your dev that it can be 1/5th of a contractor?
It's no wonder you're having trouble finding anyone.
------
jarsin
I think tons of experienced people that love programming moved on because of
the "Google Interview".
You got all this experience and love making stuff for users, but you don't
know the "insert trick of the week" to solve the latest "elite" programming
question. Bye Bye. No more jobs for you.
~~~
hirundo
Turns out that Google isn't the only outfit hiring coders. I could never have
passed their interview but I've had a long career making a good living working
for companies that never asked me to write a self-balancing binary search tree
off the cuff.
~~~
jarsin
I run into the "Google Interview" almost everywhere nowadays.
~~~
scarface74
I can speak about my last 5 jobs over 10 years. Before that, I stayed at one
company way too long.
Job 1 - I had already been working at one company way too long. But I wasn’t
asked any tough technical questions. I explained both my professional
experience, and that I got my start as a hobbyist in '86 in 6th grade.
To be honest, with my programming experience I was overqualified for the job,
but I wanted to get into .Net and away from C so I took a high level entry job
as a .Net developer and it was basically a vertical salary move (wage
compression is real).
2\. I remember having a real simple written test that made sure I could write
FizzBuzz (see the sketch at the end of this comment), knew the basics of .Net
and knew how to design and use databases. After that, I sat down and did pair
programming with an IDE where I had to make failing unit tests pass. Again I
was still punching below my weight class, but I was more concerned with
learning than maximizing salary. It was a 10K bump, with a Fortune 10 company
that was well known at the time.
3\. Basic technical interview making sure I knew .Net, JavaScript and
relational database theory. They asked a lot of questions about architecture
and my “whiteboard interview” was drawing out a relatively complex, scalable
system.
4\. Slightly technical but the main question I remember is “tell me what steps
are you going to take to create this software development department we need”.
I was interviewing for a Dev lead position without knowing it.
5\. “Here are some real world issues we are having with our architecture. We
are on AWS. How are you going to solve them?”
Yes I’m still supposedly a hands on developer.
~~~
kamaal
Companies that consider themselves special generally do "Google Interviews".
That includes the bulk of the FAANG-class companies, or anybody who pays in
the same ballpark.
There is also a huge alumni pool of the FAANG club, who have a vested interest
in doing "Google Interviews". Basically these people have spent so much time
(thousands of hours) on it that the only way they can justify it is by making
the interview process that way. Anybody who hasn't spent that kind of effort
is obviously beneath their station.
But I hope you see where this is going.
>>Here are some real world issues we are having with our architecture. We are
on AWS. How are you going to solve them
The "Google Interview" club likely won't ask you these questions. Their job
isn't to build software. It is to get good at interviews, if you are good at
interviews, interviewing is your day job, as practicing questions brings you a
raise/promotion/title-change every year.
Why waste time building software?
~~~
scarface74
And that’s why I’ve never had any interest in the Silicon Valley/Startup
culture.
I’m very happy in a major metropolitan area with a relatively low cost of
living anf a good relative salary working as an “Enterprise
developer/architect”.
------
cashsterling
I have run into some of these issues in the past when trying to get into
software.
I'm a chemical engineer / scientist with some programming experience/skill*
but I would suck at modern coding interviews because I haven't programmed
regularly in my past 3-4 roles. I would have to get up to speed on the job,
with pre-prep before the job started, and I could develop into an awesome
programmer.
Most job postings are written such that I am 99% certain my resume would just
be a 'fast pass' in the 10 seconds a recruiter might look at it. Oh well,
engineering is pretty interesting too... so I really can't complain much.
*My programming experience and background: wrote Python/PyQt apps for
parsing/analyzing/visualizing semiconductor device data; wrote sensor
simulators/analysis tools in MATLAB and Python; wrote image analysis routines
in Java/ImageJ. I taught myself the languages and libraries and wrote correct
and performant code.
I dabble in Python, JS, Julia, Rust, C++, and various LISPs at home, but I
don't have a lot of time or energy after 8-10 hours a day of engineering work.
I have done a fair amount of PLC programming and control system design in the
past. I also have 10+ years of post PhD engineering and physical science in
several different fields and all of the capability and skill sets required to
be successful in those fields.
Software gigs I have applied for in past generally have not even given me the
time of day... oh well.
~~~
mixmastamyk
Job mobility is at an all time low unfortunately, due to the incredibly risk-
averse environment.
------
vorpalhex
I think this is good advice... for larger companies with well established
patterns. One of the issues of a startup is that the founders are breathing
down your neck and often don't care a whole lot about "well designed software"
(or want to overbuild MVP products). Standing up to management, building best
practices, and being agile while still pushing back when needed against
founders/sales isn't easy, but it's something engineers in a startup
environment have to deal with directly, constantly.
A larger company has more layers of management to, hopefully, help filter that
out. Larger companies have best practices written down and figured out, CI/CD
pipelines in place, etc. It gives newer engineers more opportunity to succeed
in areas where they are strong, while letting them have mentorship in areas
that are new to them.
~~~
chii
> It gives newer engineers more opportunity to succeed in areas where they are
> strong
or it just lets people who fit a mould succeed, while not allowing someone
who is creative and can think outside the box to shine, as the existing
bureaucracy bogs down the smallest of changes.
~~~
vorpalhex
One of the hardest challenges of being an engineer, especially at senior
levels, is protecting your peers from bureaucracy - but it is a doable
challenge. A corporate structure is like any other social structure and it can
be navigated and changed over time with persistent work and buy in from the
bottom up.
That being said, larger corporations do have more structure because they don't
want you to repeat the mistakes of the past. That can obviously go too far,
but if any constraint becomes "You're limiting my creative expression!" then
you've missed the goal.
~~~
chii
I find most corporate structures, levels of management and reporting
requirements (that aren't legislation based) all stem from the fact that the
"boss" can't trust the lower level guy to do their job and make decisions
without consulting someone higher on the hierarchy.
In a small startup, this isn't a problem, because decision making happens
immediately. In a large corp, you end up with bureaucracy this way.
How do other large organizations solve this problem? In the military (at
least in the US, and other western-doctrine militaries), the squad leader or
captain or ground-level troop has a lot of freedom to make tactical decisions,
as long as those decisions move towards the goal (or what's normally called
the commander's intent). Why doesn't this method work in a corporate
environment?
~~~
ryandrake
It works where there is a shared mission. But when team A wants help with
project 1 and team B’s priority is project 2, and they need approval from team
C because it touches their code, well, you get the bureaucracy and constant
escalation that you see in big software companies.
------
aphextron
>Don't hire like FAANG companies, don't use their best practices, don't use
their super oiled processes, don't play their same games with the same rules.
I'm back on the job market for the first time in 4 years and it seems like
things have changed a lot. I'm now stuck in this weird twilight zone where
every single company I talk to runs through the motions of the exact same
process, as if it were dictated from somewhere, and I've yet to
even have a genuine conversation. It inevitably leads to a "live coding"
session over the phone where I completely go blank and am unable to perform
because programming under a time constraint with someone staring at your
screen is absurd. Problems on the level of fizzbuzz become impossible because
my mind simply goes blank in those situations and I freeze up. It'd be nice if
someone would just give me a take home project where I can actually code
something properly and show off my skills, rather than conclude that I'm an
idiot who can't even code after 20 minutes of struggling with some toy problem
through my intense anxiety.
~~~
01100011
I understand the blank mind feeling. I was on a phone interview with Oculus 2
weeks ago and had a simple circular buffer question. I was expecting something
a lot harder, frankly, and hadn't practiced writing a circular buffer in a
while. I was doing fine until I got half-way through and wanted to refactor my
code. Since we were under time constraints I tried to push through and use my
existing design choice but it started the anxiety train rolling. In a couple
minutes my mind fogged over and I literally couldn't comprehend the code I had
written a few minutes ago. The interview ended poorly and I didn't get the
job. I finished the code after the interview, along with some test cases, and
the whole thing worked great. It wasn't that I didn't understand the problem,
I just couldn't implement it with someone breathing down my neck.
I interviewed with Nvidia a week later and almost had a similar issue but in
that case the interviewer sensed my reaction and managed to talk me through
things. I managed to recover and now I'm scheduled for an on-site.
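For the curious, the finished buffer was along these lines (a Python
reconstruction from memory, not the exact interview code):

    class CircularBuffer:
        """Fixed-capacity FIFO ring buffer."""
        def __init__(self, capacity):
            self.buf = [None] * capacity
            self.capacity = capacity
            self.head = 0    # index of the oldest element
            self.size = 0

        def push(self, item):
            if self.size == self.capacity:
                raise OverflowError("buffer full")
            self.buf[(self.head + self.size) % self.capacity] = item
            self.size += 1

        def pop(self):
            if self.size == 0:
                raise IndexError("buffer empty")
            item = self.buf[self.head]
            self.head = (self.head + 1) % self.capacity
            self.size -= 1
            return item

Tracking head+size instead of separate read/write pointers is the refactor I
wanted mid-interview; it avoids the ambiguous full-vs-empty case.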
~~~
superqd
Good luck
------
nalipp
Passion can't be measured with credentials so why do companies keep screening
for them?
I spent the last three years studying as a hobby and as a full-time student,
learning multiple frameworks, front-end and backend, and additional
technologies that made me curious, like Vim, and I can't even land a single
technical interview.
After coming to the Bay Area, I was thinking my github portfolio and
communication and networking skills would at least get me in the door to prove
myself.
Once I arrived, I found out I would need to learn CS topics to get past the
technical interview, and that React would be a good entry point for a first
job. So I left and studied another 6 months before returning. The second time
I got a part time job as a coding instructor at a bootcamp because I have a
history in teaching, but still struggle to get in the door for engineering
interviews.
Nobody takes me seriously without credentials. I always thought that in a
technical interview people would be able to figure out where you stand without
needing credentials. The problem is companies get flooded with resumes, so
they build automated software to screen for the best candidates - but passion
can't be screened.
------
joshfraser
At Origin, we've hired multiple people without computer science degrees and
whose resumes you would never pick out of a stack. One of our key players, for
example, was a commercial real estate broker who taught himself how to code.
The reason we hired him is that we're a 100% open-source project and he just
started contributing. And he turned out to be really great. By the time we
gave him a fulltime offer, he'd had several months to demonstrate what he
could do. For anyone who is struggling with not having the right background or
degree, find an open-source project and start contributing. It's a great way
to prove yourself and you'll learn new skills and make awesome new connections
along the way.
~~~
cure
That makes sense. It's also how I look at hiring. People who have made good
contributions to our codebase (arvados.org) are first in line.
Being able to look past the resume and pick up good people who don't fit a
traditional pattern is definitely a skill. Business people tend to not
understand this because they are stuck on matching patterns. In other words,
don't let non-technical people be in charge of hiring developers :)
------
apeace
I get the sentiment but I have two problems with this.
1) Different stages of companies require different hires. When you're starting
out, find the hungry ones. They'll get better at programming if they care, and
they'll build a lot of stuff. Once you have real customers and are growing
like crazy, hire experienced people to help you scale.
2) There is a difference between the "interview process" and the screening
process. I agree that thinking about "Can this candidate do the job?" and
"Will this candidate be motivated?" is the best way to go. But I don't have
time to sit down with the ~200 junior engineers who sent me their resume on
Indeed in order to figure that out. Hence, I filter by 5+ years of experience
and a track record of working in multiple programming languages (I don't
screen for specific languages, though). Are the metrics perfect? No. But in my
experience
they weed out the people who are too junior to be successful on my team.
Interestingly, I think that if my company were bigger we might go back the
other direction. Once we're on a stable path, and we have a few senior
engineers who want to go into management and/or do some mentorship, we can
afford to bring on more junior people and help them grow into great engineers.
There are many stages of companies, startups especially, and in some
situations it makes perfect sense to use some plain old metrics.
~~~
ralmidani
"Years of experience" is a very misleading metric. Someone could have 2 years
of experience but be more motivated and/or have worked in a more demanding
environment than someone with 5 years of experience.
seniority !== skill
~~~
apeace
Which metric do you use which is never wrong? Or do you perform a full-on
interview for every person who sends you a resume?
~~~
ralmidani
Of course not. But finding talent is far more subtle than "how many years has
this person been in industry?" That doesn't account for passion, motivation,
how much they learned and accomplished on the job, etc.
------
chrisseaton
I don't see how 'hire people who aren't proven' can be compatible with 'can
this candidate do the job'. You want to know if they can do the job, but you
aren't interested in any proof for that?
~~~
sharemywin
30-50% (it used to be 90%) of any programming job is doing crap you didn't
know how to do before.
If they have a proven track record of figuring things out, experience is
minimally valuable.
~~~
chrisseaton
> If they have a proven track record of figuring things out experience is
> minimally valuable.
I can't understand this point of view at all.
When Google wanted to build the world's best JS JIT they didn't hire anyone
who was generally able to figure things out - they hired the person with the
most experience in building dynamic language JITs in the world. Experience is
everything! Experience is knowing which rabbit holes to not go down, knowing
who to speak to when you need help, what ideas haven't been tried yet, etc.
~~~
camel_Snake
I think it's totally unrealistic to compare the average coding gig with
writing a custom JIT from scratch for Google. In most businesses the
difference between 'very good' and 'world class' won't have a large enough
impact to spend the extra resources getting that rockstar.
And then there is the issue of actually retaining top talent...
------
6stringmerc
Unmentioned Important Addition to Headline:
...because if you are trying to hire people that are proven, you will have to
pay them a fair market rate.
~~~
ttoinou
> fair market rate
Or is it that people who have not been proven before have a lower market
rate, justified (so "fair") by this lack of pre-validation?
------
lvh
There's a critical kernel of truth here that I want to point out: work-sample
tests. The longer I spend building teams, the more I am convinced: there are
people who do WSTs as a way to qualify applicants, and there are people who
just aren't serious about hiring.
It's easy to write a WST for simple things, like "can you literally write a
computer program that does this trivial straightforward thing". It's hard to
write WSTs for things that feel fluffy, like "can you manage a team". But
here's the thing: as long as that fluffy thing feels fluffy, what that really
means is you haven't bothered to figure out what success looks like for that
role, and you couldn't even evaluate that person let alone hire for them.
There's a company in Indy called Woven
([http://www.woventeams.com/](http://www.woventeams.com/)) that'll do it for
you, too. I have no relationship with them other than that they're nice people
who are trying to unfuck hiring.
~~~
throwawaymath
It would be great if you could convince large tech companies to use work
sample tests, but I just don't see that happening. So in that sense maybe work
sample tests can be a differentiator attracting candidates that don't want to
spend a few weeks doing Leetcode prep every time they look for a new job.
On the other hand work sample tests also have drawbacks. I don't know if
they're actually that much better than regular interviewing methods; I think
they just contribute an orthogonal signal instead of a stronger one. I don't
feel I can cheerlead them as much as you do in your first paragraph.
I think in an ideal world companies would allow candidates the option of
choosing either their work sample or their resume-blind, standardized
interview gauntlet. People with a lot of interviewing anxiety could opt for
the work sample. But if you impose a work sample on every candidate I
think you'll reduce your pool of available hires.
I was offered a work sample test recently and was told to spend about a week
on it. I started working on it a little bit the first day and really enjoyed
the exercise. But I was also interviewing with at least five other companies
at the same time; I simply didn't have the time between work and traveling for
onsites to really commit to the work sample.
------
75dvtwin
There is definitely an economic incentive for a business to find 'diamonds in
the rough', or thereabouts, so to speak. (
[https://idioms.thefreedictionary.com/diamond+in+the+rough](https://idioms.thefreedictionary.com/diamond+in+the+rough)
)
The traits to look for, I'd suggest:
'lack of' a prestigious education background,
living in a non-metropolitan area,
perhaps somewhat muted self-promotion skills,
genuine and _continuous_ interest in the particular field,
\+ all the other soft skills (teamwork, work ethic, respect for others,
etc).
But I think that to enable long-term, mutual benefit between the business and
its employees, the business must be able to place itself, virtually, in the
position of the employee.
And ask: in addition to salary, why would the employee continue with me?
I feel that aspect is rarely discussed or written about.
I made some mistakes in my personal career development. And the most
significant ones were due to me believing that the companies (senior
managers) I worked for actually cared about my aspirations.
So, for myself, I adopted this career management strategy, which I picked up
from lawyers:
'Up-or-out'
[https://en.wikipedia.org/wiki/Up_or_out](https://en.wikipedia.org/wiki/Up_or_out)
Basically, I would not stay at one company for more than 3 (max 5) years if I
did not make meaningful incremental career growth (which also includes
compensation growth).
Following that, even though late in my career, helped me to get recognition,
better relations in the industry, as well as better monetary compensation.
------
linsomniac
I'd add the caution that you should have clear criteria for what represents
success in the job, and the timeline. If the candidate doesn't meet those
criteria in that timeline, you both should agree that it isn't working.
My previous job was running a small Sys Admin consulting company. We hired 4
people over the years who definitely fit into the "not proven" category, with
very mixed results.
Two really struggled: One of them was "ok" but needed a ton of management, the
other never really got a basic level of proficiency despite spending most of a
year "studying" and working at the elbow of various masters. One worked out
really well and was a great worker. I feel like there was another one, but I
can't dredge up the specifics.
The proven people on the other hand were mostly rockstars. The one that wasn't
was largely due to my mismanagement of them.
So: Yes, absolutely hire the unproven. But have a plan.
------
crdoconnor
>Google's interview best practices strictly focused on algorithms and data
structure questions won't help you in your interview process.
They mostly don't help because they bear no resemblance to what 99% of
developers actually do, even at Google.
Realism in dev job interviews is criminally underrated.
~~~
maybeiambatman
Internships are probably the best job interviews.
~~~
chooseaname
Yeah, but that only[0] works for entry level developers.
[0] I'm sure there are exceptions.
~~~
maxxxxx
True. Contracting is sometimes also a way to find good people.
------
getaclue
As someone who is currently looking for a job I can definitely relate to the
troubled state of HR in the software field. What I started doing is working on
my own
content to help recruiters decide if I am what they are looking for. I added
more information about ME, what I am passionate about, what I was passionate
about previously, and what I am looking for to my about page
([https://getaclue.me/about](https://getaclue.me/about)).
This way, recruiters can decide if they like me or not. Another thing I
started doing while reaching out to companies is saying that I am open to
3-6 month "tryouts", if you wish.
I feel like we should have more of those. I also realized that social media is
very powerful for finding your way in 2018. Gone are the days of wanting to
work at a Fortune N company because they are doing great work. Most of those
unicorns turn into corp machines that they were trying to stay away from.
Another thing I started doing is looking for people that share my passion.
Anecdotally, forums, IRC, Telegram, and similar communities are definitely
booming once again thanks to the dissolution of privacy.
One challenge - especially if you are someone who cares about software and
the software engineering field - is that you have to do A LOT more work on
your own. I work full-time and I work when I get home. I don't see this
changing.
EDITed to add some \n
~~~
vidanay
If you are in the Frankfurt Germany area, I have a position open there for a
C# developer.
~~~
getaclue
Thank you for the opportunity but I am located in Toronto, Canada.
------
lordnacho
I find this is incompatible with the list of 4 questions:
> Cut luck out of your system.
You can't decide to do that. If you're a small company, you are only sampling
a little bit and luck will either help you or hurt you. If you are large, the
law of large numbers will give you something like the average. You can't
decide if you're small or large.
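A quick way to see it (a toy simulation in Python; the distribution is
invented for illustration):

    import random

    def avg_quality(n_hires, trials=1000):
        """Range of mean hire quality over many simulated hiring runs."""
        means = []
        for _ in range(trials):
            hires = [random.gauss(100, 15) for _ in range(n_hires)]
            means.append(sum(hires) / n_hires)
        return min(means), max(means)

    print(avg_quality(3))    # small team: team averages swing wildly
    print(avg_quality(300))  # large org: team averages hug the mean

With 3 hires you can get lucky or unlucky by a couple of standard deviations;
with 300, you'll land near the population average almost every time.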
As for finding people who are proven, it's up to you what level you are after.
If you want anyone who'd played division 3 football, you have a reasonably
large number of candidates. If you have a left back who's played Champions
League and is under 25, you have a small number of candidates.
Try to go with larger sets, because then the LLN will help you. Don't ask for
anyone who's played any sport for your football team, just anyone who has
played football to the level you need.
Another important thing to think about is how to work with the great mass of
ordinary people. At some stage, if things go well, you may have to engage
with people who don't want to spend 80 hours at your office, or who don't
spend their weekends contributing to exactly the projects your firm is
interested in, and who maybe aren't even all that interested in what you do.
Motivating such teams has been a major value-add for a large number of
household names.
------
slfnflctd
My initial interest in programming started before 1990. Due to an insane
number and variety of mostly personal factors, the farthest I ever got was a
couple of ancient JavaScript/HTML experiments, a few small-to-medium side
projects in MS Access and some intermediate router configuration. I've
primarily just done tech support. I've performed well in the jobs I've chosen,
though, because I get along well with others and tend to be loyal to the
organization, even if I know this is sometimes foolish. It's simply how I'm
wired.
Despite all the time that's passed and all the things I've dipped my toes
into, I feel no closer to choosing a language, framework or tooling. I have no
friends who are serious developers. I'm good at learning new things, but it
happens slowly. It would probably take me 6 months to a year of dedicated
daily time & effort just to ramp up on any one particular sub-technology
(which, considering that I'm a rideshare driver to make ends meet, is a
challenge)-- and there is no guarantee it would help me find work, since I
have no idea who's hiring for what, or what those jobs are actually like.
All that being said, other than 'professional entertainer' (haha), a software
developer is all I've really ever wanted to be, and I think with enough time
invested the right way, I'd be really, really good at some aspect of it. I
built myself a custom checking account database to better predict its future
balance, and I'm happy every time I interact with it (except when it crashes
for reasons unrelated to my code). The biggest question of all remains,
though: is it even worth the trouble to try? Would anybody even consider
hiring someone like me?
~~~
onemoresoop
The best place to learn is when you're hired as a software developer at a
company. It doesn't matter if the company is good or not, you'll be forced to
learn to make the product work. Change a few jobs, add some years of
tinkering, and you'll likely pick up the fads, buzz, experience, etc. The best
place to start is as a junior dev and quickly move up. Learning at home will
never give you work
experience since the 'real world' environment is not replicable at home.
------
chadash
This advice seems kind of crazy to me. I think you want someone who is proven
in some skill-set, just not necessarily the exact one you are hiring for [1].
In software engineering, many skills are easily picked up. If you know C++,
Java probably won't be too hard to learn. Angular and React aren't all that
different. Even backend and frontend development have a lot of overlap.
Many companies want experienced and proven hires and I think this is
completely reasonable. If you can afford it, would you rather have LeBron
James on your team or a rookie who shows some promise? However, I take issue
when companies get too specific in their requirements and exclude candidates
for not possessing skills that are easily learned by someone who is proven in
other areas.
[1] The exception might be junior developers who you are taking more of a risk
on, but also get paid less as a result. However, there are costs to training a
junior developer including paying a salary while they get up to speed and
using up experienced devs' time to train them.
------
tenpoundhammer
> 3\. Will this candidate get along with coworkers?
I think this 'getting along' is often misinterpreted as becoming best
friends. I think a better way to state this question is:
'Will this candidate be able to have successful working relationships?'
There is no reason you can't have a wide variety of people -- who would never
choose to hang out with one another -- working successfully together.
------
benkovy
As an unproven, but extremely passionate and motivated person (who is
literally 2 hours away from going into his first Software Developer
interview), I found this calming. If I don't get this position I really hope I
find a company with the same ideals portrayed in this article.
~~~
sound1
Good luck!
------
TictacTactic
As a dev straight out of university this blog resonates with me. I know I
don't even come close in knowledge to a senior dev, so in interviews I try to
emphasize that I have a hard-working personality, positive attitude, strong
desire to learn, etc. - traits I think a company would really want if a dev
was lacking in the specific skills. Sadly it feels like it always falls flat
and I end up coming across as naive.
I think the problem is mainly rooted in employers' cynical approach to hiring
employees. It's hard to evaluate people on less measurable data points when
you don't trust them. It's easier to trust tests with easily quantifiable
results rather than those with grey areas.
------
blauditore
The problem is that those things can be faked during a classic interview by a
good actor (or "salesperson"). It's much harder to fake problem-solving skills
on a whiteboard.
The pair programming approach is not terrible, but it might be problematic if
the skill fields (technologies, environment) of the interviewer and candidate
are not perfectly aligned. So either the candidate will have to work in a
company setup they're not familiar with, or the interviewer will have to try
to follow something they don't easily understand.
------
gwbas1c
One of my filters when phone screening is to ask simple questions that the
candidate should know given the stated experience. I try to use topics that
will separate out someone who's "faking it" versus someone who has the deep
(or shallow) knowledge that someone with XX years experience should know.
For example, (back when ARC was new) I'd ask someone with 5-8 years of
Objective-C to explain how autorelease works. (If you programmed in
Objective-C without ARC and didn't know about autorelease...)
Or, for Java and C# I ask some questions about exception handling. It's a very
simple concept that a lot of novices screw up. (If someone with a nontrivial
amount of C# or Java can't explain some exception handling basics...)
(Basically, my pattern is to get the candidate to discuss some well-known
details about memory management or error handling in a language that the
candidate states experience in. Any competent programmer should be able to do
this.)
I then ask some more theoretical questions that are relevant for the kind of
programming needed for the product. These are the kinds of things that someone
who has the experience needed should know without thinking too hard about the
question. If someone can answer with a lot of hemming and hawing, that's okay
too. Someone who just can't discuss this kind of theory really isn't capable
of the job... or of learning the job.
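To give the flavor of the exception-handling questions (a hypothetical
example, sketched in Python even though I'd phrase it in Java/C# terms):
what's wrong with the first version below?

    def read_config(path):
        try:
            with open(path) as f:
                return f.read()
        except Exception:
            return None   # the novice mistake: EVERY error, even an
                          # unrelated bug inside the try, becomes a silent None

    def read_config_better(path):
        try:
            with open(path) as f:
                return f.read()
        except FileNotFoundError as e:
            # catch only what you can handle, and keep the original cause
            raise RuntimeError("missing config: " + path) from e

Anyone with a nontrivial amount of experience in a language with exceptions
should be able to talk through why the broad, silent catch is a problem.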
------
cbg0
Sometimes it feels that even being "proven" isn't enough for a lot of
companies that expect you to happen to know the six programming languages they
put in their job ad, because if the ones you do know don't match up the
recruiters will assume you're not a good fit regardless of your years of
experience.
I do agree though that YMMV, and some companies are better than this and
accept that you might be capable of picking up new languages and technologies.
------
jkingsbery
For what it's worth, I work at one of the large tech companies mentioned. I
would disagree with some of the characterization. For example "Don't do
whiteboard coding on riddles or puzzles" - I think this is great advice,
because questions like that are frowned on where I work too! Generally, the
coding questions we ask come from real world applications.
I've also worked at a startup, and I think this article misses something that
really could be helpful: there are different skills needed to work at most
large companies and most start-ups. At most early-stage start-ups, scaling is
not a problem. There are examples of companies that had to scale quickly, but
for the most part life at a startup is about finding where your market is. I
think that start-up hires also need to be more flexible: there's more room for
specialization in larger companies, whereas in smaller companies, engineers
that focus on one or two things can be pretty disruptive as the needed work
shifts.
------
tombert
I feel like I tend to value theory more than most people, and when I
interview, I tend to ask extremely theory-heavy questions that I think are
typically pretty difficult.
However, I don't usually care if they can actually work out the answer in a
short amount of time, as anyone can probably find an answer to a question in
about fifteen seconds on pretty much any search engine. What I measure is how
well the person asks questions.
The way I figure it, it's far more important that the worker is good about
unblocking themselves; you can't expect someone to have memorized every
algorithm ever, but it's not unreasonable to expect them to bother someone who
knows a bit more about the subject when they don't.
Just as an FYI, I don't have any fancy credentials, or really any
qualifications at all, so it's not like I'm pushing some kind of ridiculous
MIT/Harvard/Berkeley agenda onto people.
------
notananthem
I am very skilled, highly educated, and interview poorly. When companies hire
me, I get big promotions, leadership pulls me in, etc. This is because I'm not
"an engineer" or "a programmer." The departments I work in obviously
know/recognize me but the business doesn't know the job function really, its
starting to be a more recognized area. The fact that I got as far as I did is
hilarious, and I know how to find a lot more people like me at any time, and
they'd never have gotten any of that value without hiring people like me. I
also don't know how you would discern a "fake" me from me, without some of us
on the interview. My interview was conducted by some highly qualified people
and a few not qualified people, so I knew I had room to grow because there was
a lot of dead weight.
------
fsiefken
Very good points; being on the lookout myself, I realize this also works the
other way around. I need to be good at answering:
* Can I do the job and can I communicate my weaknesses and strengths?
* Will I be motivated?
* Will I get along with the people?
* What will I be 3, 6, 12 months from now? What do I want to achieve?
* What's my motivation and attitude?
* How do I learn?
* How do I work through blocks?
------
remote_phone
It’s a great sentiment and I would love to hear this blog poster’s success
with his system that he described. Does he have actual metrics as to how
effective his description is, or is this just another example of someone on
the Internet making an edict with no data backing it?
------
amorphous
I recommend "Why Employees Are Always a Bad Idea" by Chuck Blakeman. It
describes how to build a company where everyone is a stakeholder and not a
child that needs constant watching and stupid rules in order to function. No
titles, no working hours, unlimited vacation time, you hire the whole person
(not just the BS smile at work part). You work together because you want to
find and make meaning. No CVs, and skills are far less important than
personality and fit (skills can always be learned).
The book is the reason I'm building my own company because this is how I want
to work and live (I used to work for a company like that, but unfortunately,
due to personal circumstances the founder had to split and the company
disappeared).
~~~
ttoinou
Great idea, but then couldn't they replace you and what you're doing (since
they won't be your employees, they'll be business partners)?
~~~
amorphous
No, it was the unique combination of people that made the company. Once two
key people left, everything fell apart.
------
j45
Having hired folks who weren't proven, most didn't work out. However, the ones
who did work out were great.
One difference that stood out, beyond courage and attitude, is how engaged the
candidate is in actualizing their own potential.
Everyone has potential. The person doing the hiring may see it. The candidate
may, or may not.
The only thing that mattered in the end was the candidate's ability to
actualize their own potential beyond what they may see in themselves, given an
opportunity.
The questions "What do you build/work on when no one is looking" is one
interesting question to elicit a sense of their attitude towards their
potential, and if they have the courage to undertake building/learning
something for the sake of learning.
~~~
sparrish
You have to kiss a lot of frogs to find the prince this way. It can be very
time consuming. If your company has more time than money - it's a reasonable
strategy.
~~~
j45
Agreed on kissing a lot of frogs. I wasn't recommending it, or not, just what
one experience was. Over time, it's less useful, but it's important to still
remain as open as possible.
------
jaclaz
On the actual job ad, and how it sounds, I like to cite:
[https://tudorbarbu.ninja/message-to-
recruiters/](https://tudorbarbu.ninja/message-to-recruiters/)
>We’re looking for a person with more than 100 years of experience in software
development, coding everything from BIOSes to cloud applications, knowledge of
all past, present and future operating systems and setting up secure networks.
The applicant must also be able to juggle up to twenty balls and read
hieroglyphs, be fluent in Swahili and dance like Michael Jackson (especially
moonwalking – nice to have at corporate Christmas parties).
------
AngeloAnolin
Applicants in general, I feel, are always at a disadvantage because _most_
companies leave the hiring to the HR department, who have very little clue
about what they actually need to be looking for. Most of the time, they have a
checklist, and if the person does not meet enough of their threshold, that
person is immediately passed over for the next applicant.
Add the process that some folks do to screen candidates - lots of times, these
are not objective in nature and tend to skew towards most applicants who seem
to have a very impressive profile made up of fancy words and half truths.
------
nasalgoat
I dunno, I tried to hire based on potential, and I just had to let the guy go
after six months of him failing to grasp even the basics of the job. He did
fine in the interviews but ultimately you cannot fake experience.
~~~
wccrawford
We've hired multiple people based on potential. A few of them have worked out
astoundingly well. More than that have quit or been let go for not being able
to handle the job.
It's pretty demoralizing to let someone go. It's pretty annoying to have them
quit in the first week. (Or even the first day!)
But it's pretty awesome when they work out and you get to watch their skill
grow over time.
------
duxup
I wish this was a thing, but it's not.
I worked for a company that did some layoffs of some good people. They're
capable, but their bullet points don't match many jobs .... so they're looking
for an endlessly long time.
These are great people more than capable of learning. At their previous jobs
as a team they did more work than teams 3x their size.
I know a few places that turned them down in favor of folks who matched the
bullet points... but not the job.
It's difficult to watch as I see news of places "desperate to find workers"...
but refusing to hire good people.
------
onion2k
I read " _You must have 10 years experience of <tech>_" as shorthand for "
_Have you stopped learning new things? Come and work here._ "
~~~
chrisseaton
That's ridiculous. Why does learning new technology mean you need to stop
using the old technology?
~~~
onion2k
It doesn't, obviously. My point was about how companies advertise jobs. A
company advertising a job with a requirement of 10 years experience in a
technology wants someone who's able to tackle pretty much any problem from the
outset. _They're not open to people learning._
------
mrmrcoleman
A major GPS navigation company hired me for a job I wasn't ready for in 2006
when I was 24 years old. It was my first job at a big company.
I'm pretty sure I got through because they were growing way too fast and I
slipped through the process somehow.
Being dropped into the deep end and surrounded by smart and experienced people
was incredible for me. It took me a while to get up-to-speed but when I did
I'm sure I more than repaid their investment.
------
honkycat
I can't disagree with this person more.
More than anything I have learned that education and training are hugely
important, and hiring to train leads to mediocre staff who think their two
years of development work stack up to your 4 years of college and 6 years of
professional experience.
They take forever to start writing productive code, if they ever bother
learning at all.
I will never hire someone without a degree or equivalent experience again.
Even for Jr. roles
~~~
nil_pointer
This strategy may weed out duds, but it will also occasionally overlook great
people. One of the brightest engineers I know dropped out of high school and
got a GED, but was obsessed with tech and practiced building things daily.
He's 30 and a VP now.
------
grigjd3
This feels a bit like a strawman. I've interviewed with FAANG companies and I
run interviews and I don't see these puzzle questions. I see people looking
for basics and looking into problem solving and hints that candidates have
done good work before. Granted, I've seen that urge to ask puzzle questions in
some quarters, but I've regularly seen that ignored. Maybe though I've been
blessed.
------
maybeiambatman
The author makes some good points. But, I wish they had offered some ways to
test the dimensions they speak of: courage, attitude etc.
------
warmind
So, with this in mind, can any more experienced professionals say how you can
stand out as an 'unproven' new grad/dev?
I've got work experience that isn't a tech internship (network support at a
major uni), and code projects, but without that internship (I did research
instead), it seems I and people like me are constantly at a disadvantage.
------
willart4food
Yup! Tom Peters - of "In Search of Excellence" fame - advocated the same
mantra back in the late 90's. Yes I am that old.
It makes for great copy. Really.
But then, when push comes to shove, the old search for a lego piece that fits
with all the other lego pieces continues, so that there's the alibi.
It reminds me of IBM's "nobody ever got fired for buying IBM".
Life continues.
------
contingencies
In all fields, search for hacker mindset: curious, intelligent, self-
motivated, capable of independent R&D and execution.
------
calferreira
I don't think that the hiring process in terms of what you need is that hard.
You just need to hire people that match your company culture, matching the
skill set you need for a given salary range.
Hiring someone who blends well in the company culture is half way there.
Now, whether there's someone that matches this line of thinking in the pool of
talent is another story.
------
microtherion
> "Don't hire like FAANG companies"
I work for a FAANG, and we don't hire like that either. The motto is often
"hire for potential, not track record".
The dark pattern lurking in that is age discrimination: A motto like this can
easily be taken as an excuse to completely dismiss track record, or even
consider it detrimental.
------
arikrak
Nice post, I think it can be more difficult for less-experienced (but
talented) people to find jobs than it should be. However in some ways the
FAANG companies are better in this regard since they hire many people out of
college and strive to have an objective hiring process that can be passed by
less experienced people.
------
simonhamp
Hope you don't mind me sharing some thoughts I wrote on this topic that are
similar to the linked article...
Hiring is broken.
[https://medium.com/@simonhamp/a-new-way-to-hire-tech-
talent-...](https://medium.com/@simonhamp/a-new-way-to-hire-tech-
talent-942ca2e7db8f)
------
OliverJones
Joel Spolsky figured this out a while ago.
1\. Smart.
2\. Gets things done.
Read this. [https://www.joelonsoftware.com/2006/10/25/the-guerrilla-
guid...](https://www.joelonsoftware.com/2006/10/25/the-guerrilla-guide-to-
interviewing-version-30/)
~~~
lvh
There are good tidbits in that post, but the way you measure "smart" is awful
and almost entirely subjective. The conversation "just flows"? Really? Gee, I
wonder if that's likely to make you hire people who have the same background
as you.
I agree that the "gets things done" part is important! But I think you should
measure if they can, in a controlled environment, instead of just going off
the resume and seeing if it has worked out in the past. Lots of total dipshits
manage to ride on the coattails of successful teams.
------
rademacher
Seems to me that at some point hiring moved away from hiring based on
potential to hiring based on pure skill and how many boxes the candidate has
checked. Companies seem to just want people who can turn the crank, which
doesn't sound all that appealing.
------
goodoldboys
The author hits on a lot of great points here, that I believe speak to a
larger theme: there's a ton of undervalued programming talent out there and
finding that talent can give your company a huge leg-up on the market.
------
tylerchilds
I think the manhole covers are round because they're really heavy and it'd
probably be tedious to orient a square or any shape with multiple sides to
match up perfectly every time you put them back on.
------
m3kw9
If you look at the NFL as an example, even with scouting that intense, it is
still hard to find talent.
------
rllin
not only does the industry not have a labor union, there's a culture of
passion which is almost like having a negative union.
any time passion is involved, you are being paid less than market. cash is a
more liquid currency and can in fact buy passion.
------
zyngaro
So to fix hiring let's hire people on even more subjective bases!
------
kkotak
Don't forget, FAANG are looking for minions that fit the assembly line in 99%
of cases. Not original thinkers or people who are multi-talented, wearing many
hats.
------
whorleater
or be like Netflix and _only_ hire proven people and pay them a lot
------
patientplatypus
What amuses me more than anything else is that this dystopia was created by
software developers. Oh everyone is going to a bootcamp and saturating the dev
job market? Maybe you shouldn't have "disintermediated" every possible job
from accountant to truck driver. What's good for the goose is good for the
gander.
This Video Makes Bill Gates Look Cooler Than Steve Jobs | Farewell video @ CES - iamelgringo
http://gizmodo.com/341472/this-video-makes-bill-gates-look-cooler-than-steve-jobs
======
shayan
its a funny one, but it'll take a lot more than a video for him to look cooler
than jobs
Visiting the Anderton Boat Lift on My Narrowboat (2017) [video] - CaliforniaKarl
https://www.youtube.com/watch?v=gJDqGimPc9Q
======
gambiting
If you find this interesting, I'd really recommend learning about Falkirk
Wheel - it's an absolute marvel of engineering and the only rotating boat lift
in the world. I've visited a month ago and it was fascinating.
[https://en.wikipedia.org/wiki/Falkirk_Wheel](https://en.wikipedia.org/wiki/Falkirk_Wheel)
~~~
lelf
See also: Krasnoyarsk dam inclined plane
[https://en.wikipedia.org/wiki/Krasnoyarsk_Dam#/media/File%3A...](https://en.wikipedia.org/wiki/Krasnoyarsk_Dam#/media/File%3AInclined_plane_at_Krasnoyarsk%2C_on_the_Yenisie_River.jpg)
[https://www.youtube.com/results?search_query=Krasnoyarsk+Dam](https://www.youtube.com/results?search_query=Krasnoyarsk+Dam)
------
walrus01
For those not familiar, narrowboats in England need to be VERY narrow.
[https://en.wikipedia.org/wiki/Narrowboat](https://en.wikipedia.org/wiki/Narrowboat)
There's some really fascinating videos on youtube from people who live in them
full time, with walkthrough tours of living room space + kitchen + bedroom.
~~~
PaulRobinson
Having lived on one as a child, I can assure you that it's not quite as
idyllic as it might look.
------
theluketaylor
Also possibly of interest to people is the Marine Railway on the Trent-Severn
Waterway. Boats sit in slings or directly on the car floor and transported
across the 18m elevation change.
[https://en.wikipedia.org/wiki/Big_Chute_Marine_Railway](https://en.wikipedia.org/wiki/Big_Chute_Marine_Railway)
------
Neil44
The motorways of the industrial revolution! And lovely to cycle down these
days.
------
jimmytidey
I don't want to turn into this man.
I'm going to turn into this man.
------
jtms
That was so thoroughly and charmingly English, but what an amazing feat of
engineering for the time it was built!
1950s U.S. Nuclear Target List Offers Chilling Insight - georgecmu
https://www.nytimes.com/2015/12/23/us/politics/1950s-us-nuclear-target-list-offers-chilling-insight.html
======
mikece
Chilling that "population" is listed as a target but I can't help imagining a
scenario where all of the primary and secondary targets -- military,
government, industrial, agricultural -- are taken out with strikes that
limited the amount of collateral damage and maximized the number of survivors:
how many of those people will wish they had not survived the attack?
------
Jamwinner
[https://nsarchive2.gwu.edu/nukevault/ebb538-Cold-War-
Nuclear...](https://nsarchive2.gwu.edu/nukevault/ebb538-Cold-War-Nuclear-
Target-List-Declassified-First-Ever/)
As usual, nyt adds nothing but paywalled political babble, here is the
relevant link.
Ask HN: Software for helping visually impaired persons? - adamwi
My dad is slowly going blind due to macular degeneration (basically the nerve cells in the retina slowly die, starting in the macular region). He still has OK vision but starts to have real issues reading longer texts; long term he will go completely blind.

I'm now looking for software (preferably open source) that can help him when using the computer. The main use case is being able to retrieve larger quantities of information (e.g. emails, news articles, etc.) either by magnifying some part of the screen or by reading the text out loud.

The current solution is using a large "analog" magnifying glass in front of the screen; it feels like there should be a better solution. Long term it would be preferable if the solution can also assist him with navigating the web using audio.

Did some googling but mostly ended up on pages with more generic tips on how to adjust ambient lighting in the room etc. Any tips on where to look?

Grateful for the help!
======
10dpd
What kind of computer does he have? Screen magnification software is built
into most OS's for free.
E.g. macOS:
[http://www.apple.com/uk/accessibility/osx/#vision](http://www.apple.com/uk/accessibility/osx/#vision)
Free screen readers:
VoiceOver (built into macOS / iOS)
NVDA: [http://www.nvaccess.org/](http://www.nvaccess.org/)
ChromeVox: [http://www.chromevox.com/](http://www.chromevox.com/)
~~~
adamwi
Thanks, looks like good free alternatives to start with! Any experience
regarding whether macOS or Windows provides better support? He is currently on
a Windows computer.
------
Davidbrcz
There are screen readers (text to speech generators), braille displays.
I don't know what is the status for fingerreader (
[http://fluid.media.mit.edu/projects/fingerreader](http://fluid.media.mit.edu/projects/fingerreader)
)
Also have a look at HandyDV Linux ([https://handylinux.org/index-
en.html](https://handylinux.org/index-en.html)). The aim is to offer an
accessible computer for visually impaired and blind people. A French guy is
behind it and there is a Kickstarter-like campaign to support it (French page
about it: [http://linuxfr.org/news/financement-participatif-de-
handydv-...](http://linuxfr.org/news/financement-participatif-de-handydv-
linux-et-sa-machine-a-lire); couldn't find anything in English).
------
lovelearning
This article written by a blind developer was posted here some years ago. You
might find some good tips there about tools.
[Article]: [http://blog.freecodecamp.com/2015/01/a-vision-of-coding-
with...](http://blog.freecodecamp.com/2015/01/a-vision-of-coding-without-
opening-your-eyes.html)
[Discussion]:
[https://news.ycombinator.com/item?id=8965048](https://news.ycombinator.com/item?id=8965048)
~~~
adamwi
Thanks! Knew I had read it somewhere but could not find it when searching.
------
scpotter
You're looking for accessibility tools. Modern desktop and mobile OSes have
these built-in, Google [whatever OS] accessibility zoom.
Open source screen reader: NVDA (www.nvaccess.org) Industry standard screen
reader: JAWS (www.freedomscientific.com/Products/Blindness/JAWS)
Can Microbes Encourage Altruism? - tdurden
https://www.quantamagazine.org/can-microbes-encourage-altruism-20170629/
======
cronjobber
If _humans_ had parasites that would encourage them to be altruists against
their interest and for the sole benefit of their parasites—which is of course
implausible and hypothetical and absolutely not even possible—would they make
us downvote and flag comments on HN discussing the possibility of parasites
manipulating humans to be altruists against their interest?
~~~
ianai
The popularity of cats on the internet is at odds with your premise. They've
clearly infected many humans with thought changing microbes and yet
^H^H^H^H^H^H^H^H^H...what were we talking about?
~~~
Poodlemastah
Uhm, just saying, you guys ever heard of toxoplasmosis?
------
Terr_
Now I'm thinking of an old short story titled "The Giving Plague", by David
Brin.
[http://www.davidbrin.com/fiction/givingplague.html](http://www.davidbrin.com/fiction/givingplague.html)
------
plaguuuuuu
>The researchers then pitted two types of virtual microbes against each other
in the simulation. One microbe promoted altruism in its hosts, while the
second did not.
Replace microbes with anything that's heritable and the effect still plays out
in the simulation. With respect to the study's model, is there any difference
at all between bug heredity and DNA?
~~~
yorwba
The study is here:
[https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5241693/](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5241693/)
Their model has altruistic _α_ and non-altruistic _β_ microbes, that determine
the behavior of their hosts. In each generation, hosts are paired up randomly
to interact. Altruistic hosts reduce their fitness by a certain cost _c_ and
increase their partner's fitness by a benefit _b_. Additionally, each kind of
microbe has the opportunity to infect the other host with transmission
probability _Tα|Tβ_.
The proportion of α microbes increases when _Tα b > c (1 - Tβ) + (Tβ - Tα)_
Importantly, when there is no horizontal spreading between hosts (Tα = 0),
this is never true, so simple heritable genes are not enough for altruism to
dominate.
The quantity _Tα b_ essentially corresponds to the benefit accrued by
spreading to a host you have helped before, _c (1 - Tβ)_ is the cost you have
to bear if your host is not taken over by β anyway, and _(Tβ - Tα)_ is the
natural advantage of β over α.
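If you want to poke at that condition numerically, here's a minimal Python sketch; the parameter values below are made up for illustration, not taken from the paper:

    # Takeover condition for the altruism-promoting microbe (alpha),
    # transcribed from the model above. Parameter values are illustrative.

    def alpha_spreads(b, c, T_alpha, T_beta):
        """True when T_alpha * b > c * (1 - T_beta) + (T_beta - T_alpha)."""
        return T_alpha * b > c * (1 - T_beta) + (T_beta - T_alpha)

    # No horizontal transmission (T_alpha = 0): the left side is zero and
    # the right side is nonnegative, so altruism can never win.
    print(alpha_spreads(b=2.0, c=0.5, T_alpha=0.0, T_beta=0.0))  # False

    # Strong enough transmission of alpha flips the outcome.
    print(alpha_spreads(b=2.0, c=0.5, T_alpha=0.8, T_beta=0.5))  # True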
------
yakult
There's this, there's the post the other day about encouraging 'openness' with
MDMA, there was the article proposing 'fairer' jerrymandering reform that just
so happens to shift all the votes in a single direction, plus all the
hagiographies of our tireless selfless social media censors, I'm beginning to
think there's a burgeoning industry working on novel side-channel attack
against the electoral system.
Chuck Norris Exception - criso
http://criso.github.com/ChuckNorrisException/
======
subnetvj
Here in India, we have a very popular actor, RajniKanth, whose awesomeness
occasionally goes down the line drawn by Chuck. Here are some cool RajniKanth
quotes:
Once Death had ‘near Rajnikant experience’
Rajanikanth can do a wheelie on a unicycle.
~~~
criso
Yeah, he's pretty epic: <http://www.youtube.com/watch?v=HyuzNP_UP4w>
Ten Years of Purely Functional Data Structures (2008) - ColinWright
http://okasaki.blogspot.ru/2008/02/ten-years-of-purely-functional-data.html
======
andrewcooke
it mentions me! my one claim to fame... here's the review
[http://developers.slashdot.org/story/04/02/19/2257203/purely...](http://developers.slashdot.org/story/04/02/19/2257203/purely-
functional-data-structures)
~~~
DanWaterworth
I wonder what other interesting books are out there that just need someone to
review them on slashdot.
------
juliangamble
These ideas are one of the foundations of the Clojure language. Rich Hickey
mentions in one of his talks that this book got him quite excited. You can see
reference to it in Rich Hickey's Clojure bookshelf on Amazon
<http://www.amazon.com/Clojure-Bookshelf/lm/R3LG3ZBZS4GCTH>
------
ludicast
I took his PLT class at Columbia and TA-ed the class later. Best professor I
ever had. I believe he is now at West Point where students won't whine when he
hurts their brains.
Would like to see the book re-done with Haskell as the base (rather than
appendix) because that language is more "purely functional" and seems to
finally have the legs it needed back in 2008.
------
vowelless
His thesis is quite useful:
<http://www.cs.cmu.edu/~rwh/theses/okasaki.pdf>
------
igravious
Some questions. Background: I'm coming to the whole world of functional
programming via Ruby (via C).
Doesn't Clojure have purely functional (immutable) data structures? Whenever
this area is talked about I always hear Clojure mentioned. Do other functional
languages have libraries (standard or otherwise) of these? Are these data
structures implemented in Clojure itself?
I kind of see Ruby like a hybrid functional language. Could Ruby be used to
implement the data structures in this book, or would that be like a square peg
in a round hole? Didn't Ruby recently get official lazy evaluation?
Hum, so many questions, so little Google :(
~~~
noelwelsh
Yes, Clojure does have many purely functional data structures built in to its
core. I don't know the precise implementations, or if they're written in
Clojure or Java, but I believe they're based on the work of Phil Bagwell.
Every functional language implementation I know of has many of these data
structures available as a library somewhere.
You certainly could implement these data structures in Ruby, though it's
likely they wouldn't mesh as well as in a more functional language.
This Stackexchange answer has links to many papers if you're interested in
more:
[http://cstheory.stackexchange.com/questions/1539/whats-
new-i...](http://cstheory.stackexchange.com/questions/1539/whats-new-in-
purely-functional-data-structures-since-okasaki)
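To make the idea concrete in an imperative language, here's a toy persistent cons list in Python. It's a sketch of the simplest structure in the book, not how Clojure's vectors work (those are Bagwell-style wide trees), but it shows the structural sharing that makes immutable "updates" cheap:

    # Toy persistent list: "pushing" returns a new list that shares its
    # entire tail with the old one; nothing is mutated or copied.

    class Cons:
        __slots__ = ("head", "tail")
        def __init__(self, head, tail=None):
            self.head = head
            self.tail = tail  # None marks the empty list

    def push(lst, value):
        return Cons(value, lst)  # O(1); all of lst is shared

    def to_list(lst):
        out = []
        while lst is not None:
            out.append(lst.head)
            lst = lst.tail
        return out

    a = push(push(push(None, 1), 2), 3)  # [3, 2, 1]
    b = push(a, 4)                       # [4, 3, 2, 1]
    print(to_list(a), to_list(b))
    print(b.tail is a)  # True: b shares a's structure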
~~~
mtrimpe
Clojure was also the _first_ mainstream (non-PF) language to implement
Okasaki's structures as first-class data structures.
I know Scala quickly followed suit, since I remember Odersky bragging in a
meetup that he went out of his way to find the scientists that improved on
Okasaki's work and based Scala's data structures on their work instead.
------
kenjackson
Great book. Although I'd like to see an "open source" version where you can
get all the code snippets expressed in the language of your choice.
the book I wanted Haskell, but now I think I'd like to see it in F# or maybe
even C#.
~~~
tikhonj
It does have the Haskell versions in the appendix.
Besides, depending on your perspective, both SML and Haskell are basically
functional pseudocode.
~~~
kenjackson
There was something missing from the Haskell stuff, but I can't recall the
details -- it's been almost 10 years since I read it.
But when I am reading a data structures/algorithms book, I don't want to have
to deal with code (SML in this case) that I'm not super familiar with. Little
things get in the way. With modern software, and the ability to dynamically
swap out content, it seems a pity that we don't better tailor content to the
reader.
------
jph
Brilliant. I can donate $100 to get it updated and as a PDF.
~~~
binarycrusader
What is this referring to? I see nothing obvious that mentions a donation or
PDF.
~~~
jph
I'm offering to donate $100 to the author if he would like to update the book
as he mentions in his story, and I'd like to get the book as a PDF instead of
on paper because of the code samples and searching.
This is my personal gift to him because I value his ideas and I hope to learn
more. This may also inspire other people here to offer to donate, which will
be great if it happens.
------
oinksoft
(2008)
How Many Objects Can Be Juggled (1997) - mgdo
http://fermatslibrary.com/s/how-many-objects-can-be-juggled
======
jw1224
There's a fascinating video on this, which features the current world record
holder, Alex Barron — who holds the records for most balls juggled (11) and
most balls flashed (14).
See
[https://www.youtube.com/watch?v=7RDfNn7crqE](https://www.youtube.com/watch?v=7RDfNn7crqE)
------
avmich
> I hate to break it to you aspiring numbers jugglers, but no human will ever
> juggle 100 balls.
I hate it when, in the first sentence, the author makes a categorical claim
and then half a page later cites "a - acceleration of gravity = 9.81 m/s^2" as
if it's some sort of fundamental constant :).
Here is an example of juggling under simulated different gravity:
[https://blog.ted.com/athletic-machines-raffaello-dandrea-
at-...](https://blog.ted.com/athletic-machines-raffaello-dandrea-at-
tedglobal-2013/)
~~~
iiv
Also, he wrote:
"g = acceleration due to gravity = 9.81 m/s^-2"
m/s^-2 is not the correct unit for acceleration.
~~~
mark_edward
Yes it is. That's the SI unit.
[https://en.wikipedia.org/wiki/Acceleration](https://en.wikipedia.org/wiki/Acceleration)
~~~
iiv
No it isn't. That's not the SI unit.
[https://en.wikipedia.org/wiki/Acceleration](https://en.wikipedia.org/wiki/Acceleration)
TIP: Look closer =)
~~~
mark_edward
Oops, thanks, guess I have mental autocorrect turned on
~~~
iiv
Haha, no problem. It's easy to miss =)
------
cridenour
If you're looking for a great long form article around juggling, I highly
suggest Grantland's article[1] as one of my favorite "why did I just spend an
hour reading about that?"
[1] [http://grantland.com/features/anthony-gatto-juggling-
cirque-...](http://grantland.com/features/anthony-gatto-juggling-cirque-du-
soleil-jason-fagone/)
------
ColinWright
One thing that needs to be discussed is the accuracy, which is independent of
gravity. You can juggle at about 6 to 8 throws per second, so if you want to
juggle, say, 50 objects, each one needs to be in flight for about 7 seconds.
That gives 7 seconds of time to drift from the intended landing location (time
_and_ space) and that's a long time.
Even if you don't have to throw things really high, their "beaten area" grows
with time, and that's a quantity I'd be interested in seeing explored.
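Running the comment's own numbers: with b balls at r throws per second, each ball is airborne for roughly t = b/r seconds, and a ballistic throw that stays up for t seconds peaks at h = g*t^2/8. A quick back-of-envelope (ignoring dwell time and air resistance):

    # Flight time and throw height needed to keep b balls going at
    # r throws per second, treating each throw as a simple ballistic arc.
    g = 9.81  # m/s^2

    def flight_and_height(balls, throws_per_sec):
        t = balls / throws_per_sec  # seconds each ball must stay up
        return t, g * t**2 / 8      # peak height of a throw aloft t seconds

    for balls in (9, 14, 50, 100):
        t, h = flight_and_height(balls, 7.0)
        print(f"{balls:3d} balls: {t:5.1f} s per throw, ~{h:6.1f} m high")

At 50 balls that's already a throw of roughly 60 metres, before the drift problem even enters.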
------
osteele
Also see (Claude) Shannon's Juggling Theorem:
[https://en.wikipedia.org/wiki/Juggling#Mathematics](https://en.wikipedia.org/wiki/Juggling#Mathematics)
Shannon also built juggling robots. (But not, to my knowledge, machines to
measure human juggling.) Here's a video:
[https://www.youtube.com/watch?reload=9&v=sBHGzRxfeJY](https://www.youtube.com/watch?reload=9&v=sBHGzRxfeJY)
~~~
imron
> Shannon also built juggling robots. (
I'm disappointed this Kickstarter project never received funding:
[https://www.kickstarter.com/projects/958028900/human-
scale-p...](https://www.kickstarter.com/projects/958028900/human-scale-
prototype-for-bugjuggler-a-car-jugglin)
------
taco_emoji
Here's the same paper without the maddening UX:
[http://www.juggling.org/papers/limits/](http://www.juggling.org/papers/limits/)
------
nsxwolf
I can't juggle. Not even in slow motion, with scarves. I've tried many times.
I'm convinced some people just can't "get" it.
~~~
patrickmay
I found scarves harder to learn. You can learn to juggle three balls in 30
minutes:
1\. Hold one bean bag in your dominant hand while standing, facing a wall,
about a foot away.
2\. Drop the bag. There, that's over with. (You'll be doing it a lot.)
3\. Pick up the bag.
4\. Toss the bag from one hand to the other, keeping your elbows loosely at
your sides, until you consistently have it arcing at about eye level before
descending to the other hand.
5\. Do that 100 more times.
6\. Hold one bag in each hand (drop them once, if you like).
7\. Toss from your dominant hand and, when the bag is at the top of its arc in
front of your eyes, toss the other bag. Catch them both.
8\. Repeat without pausing another 100 times.
9\. Hold two bags in your dominant hand and one in your other hand. Toss the
first, wait for the top of the arc and toss the second, wait for the top of
the arc and toss the third.
10\. You're juggling. Drop all the bags to celebrate.
~~~
jay-anderson
Definitely. I've had a hard time convincing people to do the required
repetitions. They end early and say that they can't juggle. The few people
that have, successfully juggle in a relatively short amount of time. Their
form isn't great and they can't keep it up for a long time, but they have a
great start.
A couple other exercises/tips worth mentioning:
\- Start with your non-dominant hand for 2 balls as well (alternate which hand
you start with).
\- Stand over a couch or bed to make it less costly to drop a ball.
\- Stand in front of a wall to notice when you're moving forward.
\- For the advanced: try two in one hand (much harder than 3). Will make 3
ball juggling easier.
~~~
Kiro
> try two in one hand (much harder than 3). Will make 3 ball juggling easier.
What does this mean?
~~~
scbrg
Not GP, but if I were to guess: Try juggling only two balls, but use only one
hand. Here's a video for demonstration:
[https://www.youtube.com/watch?v=9uMui692JHU](https://www.youtube.com/watch?v=9uMui692JHU)
------
spiralx
What I find fascinating is how there's a whole mathematical formalism for
describing juggling patterns called siteswap which allows patterns to not only
be described, but discovered. It's based on throw heights, with the basic
3-ball cascade being represented by 3 - a sequence of balls thrown so they
land 3 beats later on, repeated indefinitely. 531 indicates one ball thrown
high, one mid-height and then one passed directly across from one hand to the
other.
Basic intro:
[http://www.twjc.co.uk/siteswap.html](http://www.twjc.co.uk/siteswap.html)
Much more in-depth treatment here including synchronous and multiplex
patterns, ladder and causal diagrams and a bunch of proofs relating to the
notation:
[https://www.jugglingedge.com/pdf/BenBeeversGuidetoJugglingPa...](https://www.jugglingedge.com/pdf/BenBeeversGuidetoJugglingPatterns.pdf)
but you can always go here and try a few patterns such as 3, 441, 531, 504,
(4,4) or in windmill/Mill's Mess mode, 423
[http://www.gunswap.co/](http://www.gunswap.co/)
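Vanilla (single-digit, asynchronous) patterns like those are easy to check mechanically: throw i lands on beat i + s_i, and a sequence is juggleable iff no two throws land on the same beat mod the period; the average throw then equals the number of balls. A small sketch (it doesn't handle sync patterns like (4,4) or multiplexes):

    # Vanilla siteswap test: valid iff the landing beats (i + s_i) mod n
    # are all distinct; the ball count is then the average throw value.

    def siteswap_balls(pattern):
        throws = [int(ch) for ch in pattern]
        n = len(throws)
        landings = {(i + s) % n for i, s in enumerate(throws)}
        if len(landings) != n:
            return None  # two throws collide: not juggleable
        return sum(throws) // n

    for p in ("3", "441", "531", "504", "423", "443"):
        print(p, "->", siteswap_balls(p))
    # "443" fails the collision test; the others are 3-ball patterns.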
------
qop
Can more objects be juggled on the moon than on earth?
~~~
rtkwe
Yes, check the definitions in the author's theoretical max-ball calculation on
the first page: gravitational acceleration determines the 'hang time' of a
ball for a given value of hand acceleration. So, holding everything else the
same, the same person with the same ability to throw 9 balls at g=-9.8m/s^2
would throw the balls higher on the moon, giving them more room in their
pattern for more balls.
------
ErikAugust
I came here looking for a paper about OO and having too much state. But this
was interesting.
------
tambourine_man
I'm fascinated by these seemingly useless yet remarkably mind grasping
problems.
I remember reading that Feynman had a profound insight while calculating the
wobbling of plates being thrown on a ship.
You never know where a fertile mind can be taken by those aimless thoughts.
~~~
pvg
It was just the cafeteria at Cornell, slightly more prosaic than a ship.
------
lanius
I'm surprised that robots have not yet surpassed human jugglers.
[https://en.wikipedia.org/wiki/Juggling_robot](https://en.wikipedia.org/wiki/Juggling_robot)
------
logicallee
Could a styrofoam or other ball fall more slowly (like a feather) and
therefore allow more of them to be thrown?
~~~
jcranmer
The acceleration due to gravity is independent of mass. A 1cm ball made of
lead and a 1cm ball made of wood would both fall at the same speed.
What does make a difference is shape, due to differing amounts of air
resistance and lift.
~~~
tzs
The force from air resistance at a given velocity will be the same on the 1 cm
lead ball and the 1 cm wood ball because they have the same shape and size.
The acceleration from air resistance is the force divided by the mass, and so
the 1 cm lead ball has much less acceleration from air resistance than does
the 1 cm wood ball.
As you note, the acceleration due to gravity is the same on both balls.
The net acceleration, therefore, is larger for the 1 cm lead ball than for the
1 cm wood ball, and so the lead ball falls faster.
The difference in acceleration between the lead ball and the wood ball might
not be large enough to easily see if you just drop them a couple meters.
Try a ping pong ball and a lead ball of the same diameter (4 cm). That lead
ball would weigh 372 g. A ping pong ball is 2.7 g. At a given velocity, the
acceleration from air resistance would be almost 140 times greater on the ping
pong ball than on the lead ball. That should greatly reduce the height you
need to drop from in order to see the difference.
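For concreteness, the deceleration ratio is just the mass ratio, 372/2.7, or about 138, and plugging the standard quadratic drag formula in gives terminal velocities that differ by more than a factor of ten:

    # Terminal velocity: drag equals weight, so
    # 1/2 rho Cd A v_t^2 = m g  =>  v_t = sqrt(2 m g / (rho Cd A))
    import math

    rho, Cd, g = 1.225, 0.47, 9.81  # air density, sphere drag coeff, gravity
    r = 0.02                        # m, radius of a 4 cm diameter ball
    A = math.pi * r**2

    for name, m in (("lead", 0.372), ("ping pong", 0.0027)):
        v_t = math.sqrt(2 * m * g / (rho * Cd * A))
        print(f"{name:9s} ball: v_t = {v_t:5.1f} m/s")

    print("deceleration ratio at equal speed:", round(0.372 / 0.0027))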
~~~
ctchocula
This doesn't sound correct to me. The acceleration due to gravity is the same
on both balls. However, since the lead ball weighs more than the wood ball,
the force of gravity on the lead ball is stronger than the force on the wood
ball (F_g = mg).
The effect of air resistance can be modeled by a different force. It is
typically modeled as a linear function of velocity rather than mass, and it
models the behaviour that the faster an object is traveling the more it is
affected by air resistance which acts in the opposite direction as the
direction the object is traveling in (F_a = -kv). By adding the two forces
together, you get a second-order differential equation that describes how the
object behaves (F_g + F_a = F_net = ma).
[1]
[https://oregonstate.edu/instruct/mth252h/Bogley/w02/resist.h...](https://oregonstate.edu/instruct/mth252h/Bogley/w02/resist.html)
~~~
tzs
> The effect of air resistance can be modeled by a different force. It is
> typically modeled as a linear function of velocity rather than mass [...]
That's the key. The air resistance force is a function of velocity [1], and it
is not a function of mass. Two bodies that are aerodynamically identical (same
shape, size, same boundary interactions with the air) experience the same air
resistance force at a given velocity, regardless of their masses.
As you note, the falling body has two opposing forces. Gravity, which is
proportional to the mass of the body, and air resistance, which depends on
velocity and does not depend on mass. The motion of the body is determined by
the net force.
The net force on the ball is mg + D(v), where m is the mass of the ball, g is
9.81 m/s^2, and D(v) is the function that gives air resistance of the ball at
velocity v. (Note: I'm using a coordinate system where balls fall in the
positive direction. In this system D(v) will be negative).
The net acceleration on the ball is net force divided by mass. This is g +
D(v)/m.
Note the effect of varying the mass, leaving all else the same. Remember, D(v)
is negative, so the effect of the D(v)/m term is to reduce the net
acceleration the ball feels. In other words, it is to make the ball fall
slower.
If we raise the mass, we reduce the magnitude of D(v)/m. We reduce the amount
that air resistance slows down the ball. If we lower the mass, the opposite
happens. Air resistance slows down the ball more.
For a given ball, as it falls and picks up speed, the air resistance goes up,
becoming more and more effective at countering the gravitation force. This
puts an upper limit on how fast the ball can fall--the so-called "terminal
velocity". This is the velocity where D(v) = -gm. Note that for balls with
larger mass, terminal velocity will be higher.
[1] a linear function at very low speed with no turbulence, a quadratic
function in most situations we normally encounter in everyday life. The
quadratic is 1/2 p Cd A v^2, where p is the density of the air, Cd is the drag
coefficient (0.47 for a sphere), A is the cross sectional area, and v is the
speed relative to the air.
~~~
ctchocula
You are correct only if all else is the same. However, I take issue with the
assumption that you can leave all else the same. The net force on a heavier
object is a higher for the lead ball than for the wood ball. The net force for
the lead ball is m_l x a, but the net force on the wood ball is m_w x a where
m_l>m_w, and the acceleration a depends on the shape of the object and can be
thought of as the same.
~~~
tzs
We were given that the lead ball and the wood ball were the same size. I am
assuming that they are both spherical.
The drag force is 1/2 p Cd A v^2 where p is the density of the air, Cd is the
coefficient of drag, A is the cross sectional area, and v is the velocity.
If both balls are spheres, Cd is the same for them (0.47). The air density is
the same for both. The cross sectional area is that same. Hence, the two balls
have the same drag force at the same velocity. Hence, the deceleration from
drag is lower for the heavier ball.
Here's an example with a basketball and a bowling ball showing what happens:
[https://www.youtube.com/watch?v=mGZLuaJ5MOc](https://www.youtube.com/watch?v=mGZLuaJ5MOc)
Note that you need quite a long drop to see a noticeable difference.
Below is a simple simulator that drops two spherical balls of the same size
but different mass, and prints how far they have fallen and how fast they are
going each second for the first 10 seconds. I'll give some results first, and
then the simulator code if anyone wants to play with it.
Here are the results for a 4 cm diameter lead ball and a 4 cm diameter ping
pong ball:
0.0 (0.0, 0.01) (0.0, 0.01)
1.0 (4.9, 9.79) (4.11, 6.99)
2.0 (19.51, 19.38) (12.02, 8.39)
3.0 (43.54, 28.63) (20.51, 8.54)
4.0 (76.58, 37.37) (29.06, 8.55)
5.0 (118.06, 45.5) (37.62, 8.56)
6.0 (167.34, 52.94) (46.17, 8.56)
7.0 (223.7, 59.66) (54.73, 8.56)
8.0 (286.41, 65.64) (63.29, 8.56)
9.0 (354.74, 70.91) (71.84, 8.56)
10.0 (428.0, 75.5) (80.4, 8.56)
Numbers in each row are: time in seconds since drop, distance first ball has
fallen (in meters), velocity of first ball (meters/second), and the distance
and velocity of the second ball.
Here's a pair of 12 cm diameter balls one weighing 16 pounds (maximum weight
for a bowling ball), and one about the weight of a basketball:
0.0 (0.0, 0.01) (0.0, 0.01)
1.0 (4.9, 9.76) (4.75, 9.2)
2.0 (19.4, 19.18) (17.42, 15.59)
3.0 (43.03, 27.97) (34.92, 19.0)
4.0 (75.04, 35.91) (54.81, 20.56)
5.0 (114.53, 42.9) (75.76, 21.23)
6.0 (160.5, 48.88) (97.15, 21.51)
7.0 (211.96, 53.89) (118.72, 21.62)
8.0 (267.99, 58.02) (140.37, 21.67)
9.0 (327.74, 61.37) (162.05, 21.69)
10.0 (390.51, 64.06) (183.74, 21.69)
The simulator is just doing a simple linear simulation that assumes constant
velocity and acceleration between simulation steps. That's not super
accurate, but it is good enough to show the physics.
Simulator code below. Set r to the radius in meters of your spheres. Set m1
and m2 to the masses of your two spheres, in kilograms.
    #!/usr/bin/env python3
    import math

    p = 1.225   # kg/m^3, air density
    Cd = 0.47   # drag coefficient for a sphere
    r = .12     # m, radius used for the drag cross-section
    m1 = 7.2    # kg (16 lb bowling ball)
    m2 = .625   # kg (roughly a basketball)

    def drag(v):  # v in m/s
        # 1/2 p Cd A v^2
        return 1/2 * p * Cd * math.pi * r**2 * v**2

    def sim(m1, m2):
        y1, y2 = 0.0, 0.0  # m
        v1, v2 = 0.0, 0.0  # m/s
        dt = 0.001         # s
        for ms in range(0, 10001):
            y1 += v1 * dt
            y2 += v2 * dt
            v1 -= drag(v1) * dt / m1
            v2 -= drag(v2) * dt / m2
            v1 += 9.81 * dt
            v2 += 9.81 * dt
            if ms % 1000 == 0:
                print(ms/1000, (round(y1,2), round(v1,2)), (round(y2,2), round(v2,2)))

    sim(m1, m2)
~~~
ctchocula
I appreciate your earnestness, but running your code gives:
0 (0.0, 0.01) (0.0, 0.01)
1 (4.91, 9.82) (4.91, 9.82)
2 (19.63, 19.63) (19.63, 19.63)
3 (44.16, 29.44) (44.16, 29.44)
4 (78.5, 39.25) (78.5, 39.25)
5 (122.65, 49.06) (122.65, 49.06)
6 (176.61, 58.87) (176.61, 58.87)
7 (240.38, 68.68) (240.38, 68.68)
8 (313.96, 78.49) (313.96, 78.49)
9 (397.35, 88.3) (397.35, 88.3)
10 (490.55, 98.11) (490.55, 98.11)
~~~
tzs
It's Python 3. You ran it with Python 2. In Python 2, 1/2 == 0. In Python 3,
1/2 == 0.5. That means that in the drag function, this expression:
1/2 * p * Cd * math.pi * r**2 * v**2
always gives 0 on Python 2 because of that 1/2 factor.
If you want to run it on Python 2, either change the 1/2 in the drag function
to 1./2 or 0.5, or add
from __future__ import division
at the top to tell Python 2 you want to use the Python 3 division behavior.
~~~
ctchocula
Then I stand corrected. Thank you for being patient.
How to Steal a Phone Number (And Everything Linked to It) - urahara
https://www.fastcompany.com/40432975/how-to-steal-a-phone-number-and-everything-linked-to-it
======
alexandrerond
Meanwhile Google tries by all means to obtain your phone number and until very
recently tried really hard to have you use SMS for 2FA.
Implementing Stripe-Like Idempotency Keys in Postgres - craigkerstiens
https://brandur.org/idempotency-keys
======
aejnsn
Craig, attended your talk at ATO earlier this week--good talk. I was the guy
who asked about performance considerations of joins in Postgres using UUIDs.
This post reads like it's vaguely implementing what should be in a queue
backend. There's a locked_at field in the schema; furthermore, should this not
be performed via ... FOR UPDATE?
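For readers wondering what that would look like: a per-key row lock is exactly what SELECT ... FOR UPDATE gives you inside a transaction. A rough psycopg2 sketch against a hypothetical idempotency_keys table (the table and column names here are illustrative, not the article's exact schema):

    # Sketch: hold a row lock on one idempotency key for the duration of
    # a transaction. Table/column names are invented for illustration.
    import psycopg2

    conn = psycopg2.connect("dbname=app")
    with conn:  # commits on success, rolls back on exception
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT id, recovery_point
                  FROM idempotency_keys
                 WHERE key = %s
                   FOR UPDATE
                """,
                ("client-supplied-key",),
            )
            row = cur.fetchone()
            # ... run the next phase and advance recovery_point here ...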
~~~
skrebbel
Care to share the answer to your question about uuid join performance? I've
had a hard time finding much about that on the internet.
~~~
craigkerstiens
Sure, there is definitely a little extra overhead on the join performance,
though my experience I've seen so many other issues become the biggest
bottleneck before joins of UUIDs. We regularly used UUID as identifies at
Heroku Postgres and use them at Citus as well and they work extremely well for
us.
It is of note that we're actually using the UUID datatype though and not just
generating a UUID and throwing it into a text field.
~~~
kbenson
> It is of note that we're actually using the UUID datatype though and not
> just generating a UUID and throwing it into a text field.
I was thinking that a UUID datatype implemented as a series of ints could have
fairly good join performance, since you can effectively treat it as a series
of separate smaller int indices that you join across, and I imagine that's a
well understood and optimized problem for years now. A text UUID field though,
ugh, that just seems so wasteful even before you get to optimization
techniques.
~~~
grzm
Reading your comment, it's not clear to me whether you're aware that the UUID
datatype in PostgreSQL is a 128-bit value as opposed to a text field; or if
you're just speaking of applications that use a database where native UUID is
not available. If the latter, feel free to ignore the rest of this :)
For comparison in storage:
select pg_column_size('DDB5E9ED-60B6-4405-A6A8-18E339C7B172'::uuid) AS uuid_bytes,
pg_column_size('DDB5E9ED-60B6-4405-A6A8-18E339C7B172'::text) AS text_bytes;
uuid_bytes | text_bytes
------------+------------
16 | 40
(1 row)
As 'craigkerstiens points out, the index performance of the native UUID
datatype is very good: it's not a string comparison.
If you're using the UUID type to encode information that you may want to
access subfields of, you could create function indexes to do so. Otherwise, I
think indexing on the native UUID type is going to be better than a collection
of narrower fields and indexes.
Edit: Spell 'craigkerstiens correctly.
~~~
kbenson
> Reading your comment, it's not clear to me whether you're aware that the
> UUID datatype in PostgreSQL is a 128bit value as opposed to a text field
I was speaking towards the idea that some people throw UUIDs into text fields,
as implied by the parent;s clarification that they were _not_ doing that. For
example, in years past when that type may not have existed yet. People have
been talking about "just use a uuid for the primary key" for at least around a
decade now.
> As 'craigkierstens points out, the index performance of the native UUID
> datatype is very good: it's not a string comparison.
I would assume so! Any database that implemented an actual 128 bit numeric
type (no matter if it's converted to hexidecimal notation for display, ints
aren't stored in decimal either) as a text field would deserve quite a bit of
ridicule, in my opinion.
> If you're using the UUID type to encode information that you may want to
> access subfields of
And that answers a question I raised somewhat implicitly in my other comment,
which is why use a UUID in the first place. Because you can encode specific
sub type information into the parts. Thanks!
Edit: Forgot some words in my paragraph that mentioned ridicule, so I added
them to make sure it was obvious what and why I thought they would deserve
some ridicule (which is actually a bit stronger than how I generally like to
come across).
------
ianamartin
This is a really great article. I especially like that the author pointed out
some of the pitfalls with non-acid systems.
I started wondering at the beginning of the article how many people were going
to read it, try to implement it in mongo, and faceplant.
The only thing that strikes me as odd about the hypothetical example is that I
can’t imagine a universe where you would want a request for a ride to complete
72 hours after it initially failed.
I get the point of keeping the information around for 3 days in case of
errors. And perhaps my imagination is lacking, but most of the cases where I
would want to implement this model are time-sensitive. I need a ride now. I
need a message guaranteed to deliver exactly once.
Actually, the message queue aspect is how I think of this in general. It’s an
application/database layer that takes an at least once mq system and forces it
to be at most once. The combination then seems to imply exactly once.
Actually, after writing that out l, I can now think of a really great use case
for this that isn’t time sensitive. ETL systems.
Let’s say you are streaming live updates from some third party system to some
staging area and triggering processes from there through a broker to transform
and load data into a warehouse.
Your team pushes an update that breaks something on Friday night. Nobody works
that much on the weekend, so no one cares. The dev lead gets notifications,
but gets to take it easy until Monday morning because they (trigger warning:
intentional singular they) know that everything can be replayed and resolved
after the bug is fixed. Even in the case of frequently changing api data, you
have the entire mutation history stored in your table and can accurately
replay it.
I can get on board with this.
And yeah. Sorry for thinking out loud. Now I’m thinking of all kinds of places
in my web apps where this can be applied, as the article suggests. I think I need to
write a Python wrapper around this concept and make it dead simple to include
in projects.
Thank you again to the author for this very clear and excellent article.
------
Dowwie
Why not use Twitter's snowflake id? Instagram blogged about a Postgres
function that generates them
[http://rob.conery.io/2014/05/28/a-better-id-generator-for-
po...](http://rob.conery.io/2014/05/28/a-better-id-generator-for-postgresql/)
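For reference, the generator in that post packs a 64-bit integer as roughly 41 bits of milliseconds since a custom epoch, 13 bits of shard id, and 10 bits of a per-shard sequence. A Python sketch of the same layout (the epoch constant is an arbitrary example):

    # Snowflake-style 64-bit id: time-ordered and generatable per shard
    # with no coordination. Layout: 41-bit ms timestamp | 13-bit shard |
    # 10-bit sequence. The epoch below is an arbitrary example value.
    import time

    EPOCH_MS = 1_300_000_000_000  # any fixed recent point in time

    def make_id(shard_id, seq):
        now_ms = int(time.time() * 1000)
        return ((now_ms - EPOCH_MS) << 23) | ((shard_id % 8192) << 10) | (seq % 1024)

    print(make_id(shard_id=5, seq=42))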
~~~
bgentry
Which of the problems covered in this post would be solved by Snowflake?
------
mazerackham
This is a really great article that lays out a proper implementation of
idempotent requests, something that many people (even very experienced
developers) regularly mess up. Thanks for sharing
------
no1youknowz
I'm using MemSQL and it doesn't seem to have a UUID type, but after a quick
search for MySQL and UUID, it looks like the binary type is commonly used.
Can this be used with MemSQL's binary type, and will I get the same benefits?
Thanks
~~~
positr0n
Yes I've seen uuids in databases in both binary and varchar(36) forms.
Interacting with them with your tools/ORM won't be as nice as the native type,
but it's definitely doable and you get most of the benefits.
~~~
no1youknowz
Don't use an ORM, but otherwise thanks for the clarification!
------
poorman
I always throughly enjoy reading Brandur's articles.
------
sandGorgon
how do you guys implement atomic_phase ? im not a ruby guy, so am not able to
figure out the definition.
It looks like a atomic check-and-update, but more sophisticated.
------
koolba
This is a fantastic article.
------
naasking
"Idempotency keys", sheesh. Just use standard terminology: these are just
futures/promises.
~~~
dboreham
Wait...what? I'm all for "standard terminology" but they're for sure just not
the same thing. At all.
~~~
naasking
Sure they are. The idempotency key is simply a representation of a promise.
The E language and the Waterken server implement promise semantics exactly
like this: bind the results of a local or remote computation to a durable
representation, and that representation is used to store the result upon
completion. All future requests return the that stored result, ie.
idempotency.
In case of network partition where the client doesn't receive a reply, it can
safely and simply retry the operation on the promise, which returns the stored
result.
The E language popularized futures/promises, and it's still the gold standard.
~~~
saltcured
I think you are arguing a very niche perspective here.
The concept of idempotence in mathematics predates futures/promises by about a
century and is part of the general background understood by most people
exposed to any CS concepts. It has been prevalent across a wide range of
communities discussing networking protocol and message-passing designs since
the inception of the Internet. It appears in countless RFCs and other
standards documents which are read by people building web-based systems.
The concept of futures/promises you are promoting is honestly a niche part of
computer science from the 1970s which persists in an intersection of
distributed systems research and programming language research. It was never
the prevalent formalism shared by general CS or network engineering
communities. Meanwhile, a different aspect of the same 1970s concept has had a
rebirth as an asynchronous programming syntax and appears in widely consumed
languages. This different camp of PL researchers has essentially won the
popularity contest which defines the new "standard" meaning of futures and
promises for the general technical audience.
~~~
naasking
I'm not arguing against idempotency, or attempting to redefine it, I'm
pointing out that "idempotent key" as used in this article precisely denotes a
future. You might as well write an article on a "text-based client/server
GET/PUT/POST protocol" without labelling it HTTP.
It's stupid and unnecessarily obfuscated. This abstraction has a well-known
technical name, so use it, don't invent a new term.
And I frankly find it bizarre that you call futures/promises "niche" when
they're now part of every web browser in the world and millions of programmers
use them every day. The fact that these futures aren't durable isn't relevant
since the browser environment itself is transient.
~~~
saltcured
I find it bizarre for you to invoke the niche E programming language as if it
is obviously relevant to any use of unique identifiers for detecting replay of
messages. Why don't we criticize the author for his lack of Petri nets, or his
glaring omission of a denotational semantics in the appendix to his article?
At this point in the discussion, I am unsure that you understand promises in
distributed systems formalism, because you seem to be conflating it with the
ones recently en vogue in Javascript and other popular languages. Those
futures/promises in every web browser in the world are a purely local
asynchronous programming idiom, or what I like to think of as CPS
unconversion. I didn't suggest that browser programming is a niche concept,
but it also has nothing to do with organizing idempotent remote state changes
over a lossy message channel.
There is a vast literature using many different forms of explicit identifier
tied to some form of idempotence. Do you run around the IETF and W3C telling
them to stop saying "Message-ID" when they really mean futures/promises? Or
security researchers with their nonces? How about those pesky IPv4 and IPv6
designers with their silly ports, SYN cookies, and sequence numbers?
~~~
naasking
> Why don't we criticize the author for his lack of Petri nets, or his glaring
> omission of a denotational semantics in the appendix to his article?
Oh, was I asleep when Petri nets and denotational semantics are now used by
millions of programmers around the world?
> Those futures/promises in every web browser in the world are a purely local
> asynchronous programming idiom, or what I like to think of as CPS
> unconversion. I didn't suggest that browser programming is a niche concept,
> but it also has nothing to do with organizing idempotent remote state
> changes over a lossy message channel.
Of course it does, and I already made this point so I don't know why you're
ignoring it. Add durability to browser promises, and there you go. The
durability is irrelevant to the semantics.
> Do you run around the IETF and W3C telling them to stop saying "Message-ID"
> when they really mean futures/promises? Or security researchers with their
> nonces?
Nonces and Message-IDs have different semantics than promises. It's like
you're not even listening.
Idempotency keys have the same semantics as promises. Ergo, they are promises.
Promises are idempotent, but not every idempotent abstraction is a promise.
It's simple, I'm not sure where the confusion lies.
~~~
saltcured
You seemed fixated on establishing the primacy of the E programming language
as the inventor or rightful heir to the use of keys to correlate messages and
establish idempotent operations. So, I casually mentioned "prior art" of keys
used to manage message retry, deduplication, and replay avoidance.
For more complete examples, consider SMTP, NNTP, and their UUCP predecessors
which provided idempotent message transfer and eventually-consistent
replication using client-generated, globally distinct IDs. I absolutely fail
to see how these are different than a tweet ID or any similar solution for the
web. It is such standard practice, we take it for granted like the use of 2's
complement arithmetic.
The concept's reappearance in Twitter, Instagram, countless bespoke CRUD
applications, or the PostgreSQL example in this article are not "actually
promises" just because they're on the web. They are just the umpteenth
instance of source-labeled resources being tracked in a distributed system.
That is the web abstraction: a resource is created and we do not care whether
it is an event, object, message, computational job, bank transaction, or
packet of curry being delivered to my doorstep next Tuesday.
A Javascript promise is a local transform for asynchronous control flow. It
has zero intrinsic connection to any IO performed in that flow, nor in the
remote computation or storage that might occur. The programming language stops
where the message leaves the language's runtime and enters the real world of
distributed systems. It is not the control flow syntax which makes my
communication idempotent, it is me and the recipient agreeing to overlay
idempotence semantics on some data we agree to encode into our messages.
Message formats have nothing to do with programming languages. This is how we
understand Internet architecture. The local programming language and
programming paradigm of each node in the distributed system is immaterial to
the messaging semantics and interop standards and their concepts.
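To make the pattern both commenters are circling concrete, here is a minimal
sketch of an idempotency key in code (an illustration, not anyone's actual
API: the names are invented, and the in-memory dict stands in for the durable
storage a real service would use):

    import uuid

    # Server-side record of results, keyed by a client-chosen idempotency key.
    # In practice this table would live in durable storage, not a dict.
    _results = {}

    def handle_request(idempotency_key, operation, *args):
        """Run `operation` at most once per key; retries and duplicates
        get the originally recorded result instead of a new side effect."""
        if idempotency_key in _results:
            return _results[idempotency_key]
        result = operation(*args)
        _results[idempotency_key] = result
        return result

    # A client retrying over a lossy channel reuses the same key:
    key = str(uuid.uuid4())
    handle_request(key, print, "charge card once")
    handle_request(key, print, "charge card once")  # deduplicated; nothing printed

Whether one calls the key a promise identifier or a Message-ID, the mechanism
is the receiver keeping enough state to make replays harmless.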
New mouse design raises questions - SamLicious
https://realhardwarereviews.com/quadraclicks-gaming-rbt/
======
SamLicious
This is Qeric. I made the RBT to fix my own RSI; ask me anything!
Rosetta catches dusty organics - okket
http://blogs.esa.int/rosetta/2016/09/07/rosetta-catches-dusty-organics/
======
okket
Just a few days left for Rosetta, you might want to mark the date in your
calendar:
On September 29, a rocket burn will essentially cancel out
Rosetta's orbital motion around the comet, initiating a
free fall from an altitude of 20 kilometers. The spacecraft
will impact the comet at a speed of about 90 centimeters
per second at 04:20 PDT / 07:20 EDT / 11:20 UTC / 13:20
CEST, give or take 20 minutes.
[http://www.planetary.org/blogs/emily-
lakdawalla/2016/0909102...](http://www.planetary.org/blogs/emily-
lakdawalla/2016/09091029-rosetta-end-of-mission-update.html)
(The rocket burn is on the 29th, the impact on the 30th, of course.)
~~~
celticninja
Is it possible the impact will affect trajectory of the comet?
~~~
symmetricsaurus
No, Rosetta is already in orbit around the comet. By the principle of
conservation of momentum the impact will not change the total momentum of the
system (comet plus spacecraft).
The changes in trajectory happen when the engines are used on the spacecraft.
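For a sense of scale, a back-of-the-envelope bound (the masses below are
rough public figures, used here as assumptions): even if all of Rosetta's
momentum were delivered from outside the system, the comet's velocity change
would be at most m*v/M.

    m_rosetta = 1.4e3   # kg, approximate spacecraft mass (assumption)
    m_comet   = 1.0e13  # kg, approximate mass of 67P (assumption)
    v_impact  = 0.9     # m/s, planned touchdown speed

    dv = m_rosetta * v_impact / m_comet
    print(f"{dv:.1e} m/s")  # ~1.3e-10 m/s: utterly negligible

So even debris escaping after the impact could not meaningfully alter the
comet's trajectory.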
~~~
dandelany
Would this still hold true if Rosetta were commanded to slam into the comet
hard enough for some of the impact debris to reach escape velocity? (Not
trying to nitpick your argument, it certainly holds true for the soft landing
case, but I'm curious how strong this axiom is)
------
zengid
Can anyone comment on how significant this discovery is? Are we talking
unprecedented or has this been hypothesized for a while? Are the conditions on
such comets more conducive for organic molecule formation than the surface of
a planet or large moon might be?
~~~
lorenzhs
According to the article, they were expecting it:
_From analyses of meteorites and laboratory simulations, the team was also
expecting to identify a wide diversity of organic material in Comet 67P /C-G,
ranging from very small molecules to heavy (or ‘high molecular weight’)
organics._
------
the_duke
Organic =
[https://en.wikipedia.org/wiki/Organic_compound](https://en.wikipedia.org/wiki/Organic_compound)
~~~
lorenzhs
I doubt anyone would expect to find organically grown veggies on a comet. Not
sure what you're trying to say. Isn't it pretty clear that they're referring
to molecules containing carbon?
~~~
Practicality
There was a related discussion recently where "organic" in an article was
referring to life-forms. So it's good to clarify which organic is being
discussed.
Of course, if they found microbes on a comet this would be a much bigger
announcement.
------
mdup
Genuine question: How do we know the organic compounds seen by Rosetta do not
come from Earth and were not brought by Rosetta itself?
~~~
giomasce
Usually spacecraft like that are built in clean rooms and care is taken so
they are sterilized when sent to space. I have no specific answer, though.
~~~
ygra
This will be _especially_ true for instruments that are designed to find such
compounds. It would be fairly embarrassing to send the probe into space, only
to notice that apparently interplanetary space is full of organic compounds no
one else has ever seen ...
------
CHsurfer
“These particles have remained pristine and untouched for billions of years
until they were released in the days or weeks before being ‘caught’ by
COSIMA,"
I hope they will release the particles back into the wild once they are done
with their analysis.
~~~
aokyler
I'm not against releasing the particles...but any specific reason you hope
they release them back into the wild afterwards?
------
r0muald
Could anyone briefly explain the "stretch" theory or hypothesis that is
mentioned in the heated comments section of the blog post?
~~~
scardine
I can't - and hell, I barely understand anything they say - but the style
reminds me of how Asimov would picture an academic dialogue between two
rivals.
Find your voting location with SMS - mikesabat
http://mcommons.com/where-do-i-vote-via-txt-message
======
josefresco
Is it that hard to Google your town name and find out where to go vote from
your town's government website?
/devil's advocate
------
raganwald
When it comes to helping people exercise their right to vote, multiple
redundant data channels are good. Also, SMS is mobile and works on all sorts
of "ancient" handsets.
------
Jasber
Google Maps also has this: <http://maps.google.com/vote>
Alexa Auto Software Development Kit (SDK) - otp124
https://developer.amazon.com/alexa-voice-service/alexa-auto-sdk
======
otp124
The SDK is on Github too: [https://github.com/alexa/aac-
sdk/tree/master/builder](https://github.com/alexa/aac-sdk/tree/master/builder)
I was debating posting the Github repo, but the landing page provides more
info, so I went with the landing page.
YourMechanic (YC W12) Introduces Pre-Purchase Car Inspections - bgar
http://techcrunch.com/2013/10/04/yourmechanic-pre-purchase-car-inspections/
======
ttruong
This is great. I'm actually in the process of buying a used car now and it's
nearly impossible to get an appointment at a garage in SF in a timely manner.
How I Can Has Cheezburger became a web empire - jlangenauer
http://www.nytimes.com/2010/06/14/technology/internet/14burger.html?partner=rss&emc=rss
======
MartinCron
In a move that isn't at all shocking, the NY Times reporter didn't have much
interest in talking with me or any of the other Cheezburger Network developers
when she was in our office a few weeks ago.
If anyone is curious about the technical side of what we do and how we do it,
I'll gladly answer questions here.
Also, we're looking for software developers. We're in Seattle, and have a
strong bias for local talent, but we have a bunch of remote developers. Of the
dozen or so companies I've worked for, Cheezburger is hands down the best.
Srsly. Email [email protected] if you're interested.
~~~
iamjustlooking
I'm sure from a reporter's perspective it doesn't appear like there's much of a
technical side to what appears to be a wordpress blog with cats on it.
~~~
MartinCron
That's why I would expect a reporter, when introduced to a bunch of technical
people, to ask: "What? There's a technical side?"
Then we could talk about how we make systems to help the editorial team sort
through the 16,000 content submissions we get every day, or how we have a
public API so people can submit funny content from their applications.
The "wordpress blog with cats on it" is a very shallow view of the company.
There's a lot more going on than that.
~~~
fortes
_Then we could talk about how we make systems to help the editorial team sort
through the 16,000 content submissions we get every day, or how we have a
public API so people can submit funny content from their applications._
So could you talk about some of that now? I'm genuinely curious.
~~~
ssharp
Really. If you want your story to be heard, tell it. Start talking here or
start an IamA.
------
mikeleeorg
I'm a little surprised that no one's caught on to the real business model
behind Cheezburger Network.
What they're doing is pretty clever. It's more than just a bunch of silly
captions under photos of cats. They're building a system that capitalizes on
Internet memes that can be presented in a blog-like fashion.[1]
That, to me, is a mark of a potentially sustainable business model. To break
it down in more detail:
a) They monitor Internet memes
b) They evaluate which ones are worthy of a trial run
c) They launch a site around it
d) They assess that site's success (using whatever metrics they've discovered
are important, such as PVs and unique visitors for advertisers)
e) They pull the plug if the trial is not successful
It's not a new business model, but it's a clever one that's obviously working.
I'm particularly impressed that they're aiming to launch a site every week,
and kill only 20% of those experiments. I wonder if that number is low.
That's why they're more than just a WordPress blog with cats on it.
[1] The similarities to a blog are probably incidental. I imagine they
basically want a way to display recent user-generated entries and allow other
users to comment on them. A blogging system naturally satisfies these
attributes. Otherwise, I don't think they necessarily need their sites to look
like blogs, per se.
~~~
MartinCron
I think a lot of people (including yourself) have caught on to the business
model. It's just that many people can't get past the WTF factor of the funny
cat site to see that there's a serious business there. Nobody is keeping what
we're doing a secret.
It's not just "Internet memes" though, it's more about finding untapped types
of crowd-source-able humor. There wasn't a popular "there, I fixed it" meme
before we launched thereifixedit.com
I haven't been keeping explicit score, but the killing 20% figure feels about
right. Our count of surviving sites has been constantly growing, we're up to
53 at the moment.
~~~
mikeleeorg
Ah yes, crowd-source-able humor! Now I'm even more impressed. 53 sites is a
lot. Do you have people that oversee the content & comments on those sites?
FYI for anyone else who is curious like I am about what those 53 sites might
be: <http://cheezburger.com/sites>
~~~
MartinCron
Yes. We have a lot of people overseeing content and comments, it's most of our
staff. Each person generally does between 4 and 6 sites, with the sites
generally posting between 1 and 6 items to the front page (featured content)
daily.
For each thing you see on any front page, there are dozens of things that
didn't make it through voting, and for each of those things on the voting
pages, there are dozens of things that weren't good enough for voting. We're
able to crowdsource the generation/submission of content as well as (some of)
the curation of content. A lot of what makes a site work is finding the right
balance between editorial vision and letting users vote on what they like.
------
alttab
Amazing to hear the numbers. I'll file this one under ifart and question my
career again. I can has breakdown.
~~~
chime
Creativity comes in all flavors. Someone creates a Jump-to-Conclusion mat and
makes a million while someone spends their entire life working on elliptic
curves to improve cryptography. Get-rich-quick stories entice even the most
brilliant of the minds and we're no different. A story about some kid making
$35k/month from a stupid app for iPhone makes people wonder why they even
bother learning Clojure or spending hours improving their writing skills. The
truth is, you must ask yourself what will make you truly happy - being the
developer of a gimmick app that made money or someone who has directly
contributed to the world with their efforts.
Upon hearing the above, many have said "well, once you have lots of money, say
like the reddit guys, you can spend your time doing what you love and help the
world." No. The reddit guys WERE doing what they loved and would have done
what they're now doing regardless of the money. Sure, having money like Elon
Musk helps but then again, it may not. If you hate your career now, get the
ball rolling to change it. If you are happy with it, who cares if someone else
made millions sharing pictures of cats? I mean do you feel bad that someone in
India made $10m selling a special type of plastic sheet to farmers? Or someone
in South Africa made millions selling vuvuzelas?
If your ultimate goal is to make money, then certainly wonder if you chose the
right career. Money laundering or even investment banking might be a better
path. Otherwise, realize the money is just a serendipitous by-product of any
venture.
~~~
roel_v
"Or someone in South Africa made millions selling vuvuzelas?"
If I ever find that guy, he's in for a world of pain.
~~~
Luc
The Vuvuzela (TM) maker was once a small startup that won an entrepreneurship
competition and got help from an incubator:
[http://www.fin24.com/Companies/SAB-moves-to-protect-
vuvuzela...](http://www.fin24.com/Companies/SAB-moves-to-protect-
vuvuzela-20040519)
I have mixed emotions about this one, for sure.
------
ck2
Step 1: Take content from everyone and everywhere else and put it on your own
site.
~~~
jessriedel
More like, come up with a great web-interface to allow a single, one-time idea
from an individual user (which otherwise would have been heard only by him and
his friends) to be made available to the entire internet.
------
mynameishere
If the comment section on icanhascheeseburger doesn't drive you to suicide,
you're probably safe keeping that 45 in your house.
~~~
Daniel_Newby
I'd forgotten about the comments. Blocked them out, really.
------
thunk
Well, that and toxoplasmosis.
------
seanalltogether
To me the cheezburger network is like the paparazzi of internet culture.
Chasing down memes and raping them in front of thousands of viewers to provide
a flash of cynical entertainment before moving on.
~~~
px
Trivializes rape a little bit, doesn't it?
Screaming Architecture - huan9huan
https://8thlight.com/blog/uncle-bob/2011/09/30/Screaming-Architecture.html
======
grzm
[2011]
Ask HN: Twitter/Posterous - The Archive Team needs more time to save 1.3M blogs - jmathai
TL;DR

Posterous is shutting down in a week and the Archive Team needs more time, else 1.3 million blogs will essentially disappear.

--

I've been following an exchange between Jason Scott (Archive Team / Internet Archive) and Sachin Agarwal (Posterous) [1].

It appears that Posterous gave the Archive Team some dedicated servers to hit but it wasn't sufficient to download the amount that's going to be deleted.

I don't know all of the details but saving content to a historical archive is invaluable. Offering the ability for users to download their content is great but it serves a very different purpose than what the Archive Team and Internet Archive does.

Additionally, it's certain that much of the content will disappear because users didn't receive the shutdown email, 30 days wasn't long enough or simply didn't bother to do anything. The public content there is still valuable.

[1] https://twitter.com/agarwal/status/327153883237453825
======
ddorian43
Everyone who wants to help can install the Archive Team Warrior.
The ArchiveTeam Warrior is a virtual archiving appliance. You can run it to
help with the ArchiveTeam archiving efforts.
<http://archive.org/details/archiveteam-warrior>
~~~
nucleardog
Installed and running. It's a quick download. If you already have some sort of
VM software installed, getting it running is pretty painless.
(VMware said it wasn't compatible, but gave me the option to retry import with
relaxed restrictions. I had to move the second (scratch) disk from Secondary-
Slave to Secondary-Master, but it came up no problem after that.)
Once it was running, it's just a simple web interface you can use to select
what project you want to participate in.
All in, it was about a five minute affair.
You can track the current progress of the Posterous download at
<http://tracker.archiveteam.org/posterous/>.
Currently they have 4,846,523 items done, and 1,277,126 to go. They need all
the help they can get.
Thanks for letting me know about this, ddorian43. I've got a reasonably speedy
connection, lots of hard-drive space, and definitely don't mind helping out.
------
hub_
Simply put, former Posterous and Twitter are being douchey.
~~~
bochoh
TL;DR Twitter = Evil
------
lsiebert
Seems like someone could still donate server time for charity PR.
Man and Media (1979) - dredmorbius
http://www.marshallmcluhanspeaks.com/lecture/1977-man-and-media/
======
dredmorbius
Marshall McLuhan discussing the impacts and dynamics of new technologies,
particularly media technologies. Audio, 8 minutes.
PDF transcript:
[http://www.marshallmcluhanspeaks.com/media/mcluhan_pdf_12_ZS...](http://www.marshallmcluhanspeaks.com/media/mcluhan_pdf_12_ZS6aMYZ.pdf)
It's time to switch to a four-day working week, say two experts - joeyespo
https://www.weforum.org/agenda/2019/01/2-davos-experts-says-it-s-time-to-switch-to-a-four-day-working-week/
======
peatmoss
I think three day weekends would be undeniably good for the health of our
society and everyone living in it.
I have long (semi-jokingly) professed belief in the Church of the SubGenius.
They extol the virtues of slack. I think the intended meaning of “slack” is as
in “to slack off.” But I choose to believe in it as slack (i.e. spare
capacity) in a system.
While some might use that extra day to slack off, plenty more would use that
day to invest labors in their communities, maybe to get some exercise, maybe
do some neglected repairs around the house.
I feel like in the name of efficiency, we’ve purged a lot of slack from the
system, but that has left us with a lot of institutions that are at risk for
catastrophic failure. For people who are stretched to their breaking point,
there needs to be more slack.
~~~
Swizec
According to the theory of constraints (on which I am an expert since I read 1
graphic novel, The Goal, and 1 novel, The Phoenix Project) systems without
slack become exponentially slower until no work is capable of getting done
anymore.
The reason for this is the combination of dependent events and statistical
fluctuations. If a task takes a worker rand(1,4) units of time, you can expect
them to complete anywhere from day/4 to day/1 tasks in a day.
You look at the worker doing day/1 tasks and you say, "Well shit, that person
is slacking most of the time. Most of these tasks take 2 units, 4 is very
rare". So you ask them to work harder and impose rules so they must perform
day/3 number of tasks per day.
You look at your constantly busy workers and you're happy. No more slack in
the system.
But your assembly line grinds to a halt. Nothing ever gets done anymore.
Everyone is busy all the time. Everyone's always working. But nothing is
finishing.
What gives?
Turns out any task that hits N=4 on the random curve wreaks havoc, and you
fall behind. Then you have both yesterday's _and_ today's tasks to do. You
can't. The next day ... well the problem ends up growing exponentially.
Efficient systems have slack.
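A toy simulation makes the effect concrete (a sketch under stated
assumptions: the 8-units-per-day capacity and the per-day quotas are
illustrative numbers, not taken from the comment above):

    import random

    def backlog_after(days, quota, capacity=8, seed=0):
        """Tasks take rand(1,4) units; `quota` tasks are assigned per day.
        Returns the units of unfinished work carried over at the end."""
        rng = random.Random(seed)
        backlog = 0
        for _ in range(days):
            assigned = sum(rng.randint(1, 4) for _ in range(quota))
            backlog = max(0, backlog + assigned - capacity)
        return backlog

    for quota in (2, 3, 4):  # average load: 5, 7.5, 10 units vs. 8 of capacity
        print(quota, backlog_after(10_000, quota))

With a quota of 2 there is visible slack and the backlog stays near zero. At
3, the worker looks busy almost all the time and a run of unlucky 4-unit
tasks creates long catch-up periods. At 4 there is no slack at all, and the
backlog grows without bound.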
~~~
sonnyblarney
This is a very good bit of information, but most modern workplaces are not
factory floors 'processing work items', so the analogy only goes so far.
Also, it very well could be that reducing to 4 from 5 does nothing but simply
reduce the number of hours 'the factory is operating'. Surely there'd be a
bonus because people might be more relaxed, but it's hard to weight that
against the lost day.
For example, my local coffee shop is open 7 days a week, let's say that their
staff works 5 days on, 2 off so there's a steady rate of staffers.
If they just switched to 4 on 3 off ... I'm not sure anything would materially
change in terms of productivity etc. - and 'all else being equal' (i.e.
demand, cost of goods, wages/hour), the staffers would simply make less money
and I doubt they'd chose that.
But maybe '3/4 days' would be better in some industries than others.
~~~
Swizec
> most modern workplaces are not factory floors 'processing work items', so
> the analogy only goes so far.
I recommend reading The Phoenix Project. It does a great job of dispelling the
myth that software engineering is unpredictable and artisanal and whatnot. At
the abstraction layer of a department or division manager it becomes
indistinguishable from factory work, and the same stochastic modeling
principles apply.
But that’s besides the point. A more important point is the idea of
effectiveness.
4 days of effective work are often better than 5 days of being busy but
ineffective.
Personally I get more done in a 4 day week because I am more cut-throat about
saying No to things that don’t matter.
------
epaulson
We should all aim for 4 days as the ideal instead of 5, but we should also
drop the M-F work, S-S weekend ideal too. For large chunks of the workforce
they're already working something other than M-F anyway because we want 7 day
coverage of our retail and service sectors. More professionals - dentists and
veterinarians and insurance offices and you name it - should be open on the
"weekend", or at least not all dentists should share the same "weekend",
especially if we're dropping down to a 4-on, 3-off standard.
If for no other reason, we build a lot of infrastructure for "peak" usage,
like rush hour traffic. If we all have the same 3 day weekend that means we
have lower "weekend" traffic one more day but the peaks stay the same, but if
we better distributed our weekends, overall peak would go down a bit.
It's a whole new set of coordination problems, of course, but we don't all go
to church on Sundays anymore, we don't all need to be off the same day.
~~~
david-gpu
I get what you are saying, but presumably weekends still exist because
families and friends want to do group activities at a time when all are
available. If you have children younger than 12 or so you need to be available
when they are not in school.
~~~
kennywinker
If you have children of working age they are likely working service industry
jobs and work weekends and evenings anyways.
~~~
obmelvin
Children that old don't need to be taken care of as much. It makes more sense
to optimize for taking care of younger children.
------
b0rsuk
Working as a programmer, I already started de facto shortening my shifts
without cheating on my employer: in the 7th, and especially the 8th, hour I don't
write new code anymore unless it's in places I have very solid understanding
of. My mind tends to slow down in the final hours, and it takes a lot of
effort to come up with something new, and my bug rate increases.
What do I do instead? Binge read documentation to learn about new functions
and parameters that may make my work easier. Tweak vim configuration.
Experiment with new shell commands. Clean up my email inbox and various
notifications.
These activities still push the work forward, but don't require as much
creative juice, and there's no penalty for mistakes.
I would still prefer a 4 day week, but it's the next best thing.
~~~
jlawson
I do this too.
Same principle applies in the gym - you start out the workout with the super
strenuous squats and deadlifts, then move to the difficult bench press and
dips, then finish with relatively easy bicep curls, tricep pushdown, and
cardio.
Some things just can't be done effectively unless you're above a certain level
of rest; other things can be done even if you're tired. It just makes sense to
sort things into the period of time where you can actually do them.
In coding this extends all the way to watching lightweight YouTube videos
about coding late at night when you're tired.
------
itamarst
You don't have to wait for society to make this happen (and in fact waiting
will fail you). This is something many programmers have negotiated on their
own (e.g. [https://codewithoutrules.com/2019/05/09/part-time-
software-d...](https://codewithoutrules.com/2019/05/09/part-time-software-
developer/)).
More broadly this is one of many good reasons for programmers to unionize;
even if salaries are high, working hours are still far too long.
~~~
Rapzid
I think too many people are willing to take a pay cut to do this. If you
believe you will be as or more productive, or that your skills and knowledge
are more valuable than just your raw time, you should consider negotiating a 4
day week with no pay cut.
~~~
necovek
Since I am willing to take a pay cut, and even a pay cut per hour, here's my
reasoning.
For every hour spent at work, I am not spending it on something of my own
choice. Hours of my own choice are worth much more than hours of work to me,
because they usually fulfill me more.
So, if I am to work a couple of hours a day, that's going to cost you little.
The more you take from my "own" hours, the costlier they get. So, if I consider
a normal work week to be 24h, anything above that costs non-proportionally
more. I.e. 40h is not 40/24 more, but actually 24 x base_hourly_rate + 16 x
own_factor x base_hourly_rate, where own_factor is usually around 2, depending
on how much I might like the base work.
So, in a sense, I am not taking a pay cut, I am just taking a reasonable
salary for doing the work, but if someone insists on taking more of my time
for little benefit to them and a lot of burden to me, it's going to cost
significantly more.
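In code, a pricing scale like the one described above might look as follows
(a sketch; the function name, the 100/hour base rate, and the defaults are
illustrative assumptions):

    def weekly_price(hours, base_hourly_rate=100.0, normal_week=24, own_factor=2.0):
        """Hours beyond a self-defined 'normal' week are billed at a
        multiple, so long weeks cost non-proportionally more."""
        base = min(hours, normal_week) * base_hourly_rate
        premium = max(0, hours - normal_week) * own_factor * base_hourly_rate
        return base + premium

    print(weekly_price(24))  # 2400.0 -- the baseline week
    print(weekly_price(40))  # 5600.0 -- not 40/24 x 2400 (= 4000), but more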
~~~
qntty
You're essentially paying your employer so that you can have more free time.
One could argue that if you feel the need to pay for your own life, you're in
a sense the property of another person. If you do X amount of work, why
shouldn't you be entitled to the fruits of that labor? Why does anybody have
the right to take the value that you created away from you?
~~~
necovek
In European copyright law, they actually can't: non-exclusive (eg. can't be
sold) moral right protects the obvious moral aspects, but there are also
compensation related clauses (eg. I created a "work of authorship" and got
compensated $1000, yet company earned $5 million off it, I'd be entitled to
ask for a fairer share of the profits). I am sure this hasn't been tested in
courts for software development so far; with it being such a collaborative
endeavor, it'd be hard to lay a claim to how much your work was really worth.
I also take a different view: a person is not owned, but it's normal for
labour to be paid. A labourer can design their own pricing scale to encourage
labour deals they prefer. Just like companies design their pricing schemes to
eg. support only large customers (up to 20 users free with no support, paid-
for afterwards).
------
necovek
I've long argued for a 4-day 6-hour-day work week. I've even questioned a few
potential employers about it, citing research: I've even offered to take a pay
cut (per hour, i.e. hourly rate was cheaper than for 5-day 8-hour day weeks),
but nobody was interested.
They would frequently say how they are not interested in "part time" work. I'd
counter that this is full time work, with efficiency higher than the full time
work, because people can focus on intellectually hard problems only for a
short while sustainably. Sure, I can put in a couple of weeks of 12h days, but
after that, I'd struggle to put in 4h days of quality, focused work (well
known as burnout). Similarly, 8h days are not sustainable either, though it
takes longer to burn out.
As people have noted, the extra time I get would not be spent in pyjamas
watching netflix: it would be quality time with my family, working on projects
and research, etc (if it wasn't for miserable pay and state of academia, I'd
probably be doing research exclusively). Civilization as a whole would benefit
as a result if there were more people putting their brains to problems they
think matter. And as stated numerous times, even employers would benefit.
But alas, when there's the next guy willing to submit to the "norm", it's hard
to get the ball rolling.
~~~
smsm42
> They would frequently say how they are not interested in "part time" work.
> I'd counter that this is full time work, with efficiency higher than the
> full time work, because people can focus on intellectually hard problems
> only for a short while sustainably.
Is it really true? Can you prove it? I mean you can argue that, but if you're
absent 20% of the time compared to other workers, is it true that your value
is still the same because you're so much more productive? Maybe yes, but can
you prove it to an employer? You're asking them to take a risk on supporting
an unfamiliar approach - when their familiar approach has probably worked for
them for years and they are fine with it. What do you offer them to justify taking
this risk? I mean, maybe you are so spectacular that employing you is worth
any risk. But naturally most people aren't that exceptional, by definition.
Their experience shows 5-day weeks work great for them; how much better would a
4-day week have to be to justify the risks?
> Civilization as a whole would benefit as a result if there were more people
> putting their brains to problems they think matter.
Is there any proof that there's a significant marginal increase compared to
thousands of existing research institutions that have tens of thousands of
very smart people already spending years attacking practically every important
problem? Would amateurs spending one day a week on side projects significantly
change the picture here - and offset the above-mentioned professionals _not_
spending one day a week on their area of expertise (instead doing their
hobbies in turn)? I am not sure this is that obvious.
~~~
losvedir
People argue it here all the time but I'm certainly skeptical. The hardest
intellectual work of my life, my undergrad at MIT, I spent far more than 8
hours a day, 4 days a week on. The idea that all the undergrads there burning
the midnight oil are just being foolish, and could get all their work done in
less time if only they knew better, doesn't pass the smell test for me.
~~~
akhilcacharya
Most people don't do MIT level intellectual work...
~~~
Freak_NL
Such an effort is also quite different from salaried work. When you are taking
an undergraduate course you put a lot of effort into a finite project. You
tend to have the energy reserve for something bounded like that, but it does
come at a cost. Usually, at the age where you would enrol in an undergraduate
program, the gains outweigh the cost, and the cost is often lowered
significantly by the intrinsic motivation of working on your own interests.
With salaried work the only effective limit is your retirement. Working more
simply means more income, but if that extra income isn't needed, or doesn't
net you enough extra benefits compared to simply having time off, the cost of
being mentally engaged for such a large part of the week just doesn't weigh up
to the gains.
Sure, if your work is so engaging and rewarding that work itself is a
pleasure, then it might pay off. But for most of us it just doesn't work that
way. Work often means doing things were mostly others set the agenda, and
while you may be good at what you do and find motivation in doing it, I've
found that it rarely means that you can do it with the same sustained level of
energy and quality for more than four eight hour days — and even that isn't a
given.
------
wsc981
In The Netherlands /a lot/ of people work 4-day workweeks already [0]. It's
not that novel. But it'd be good if more countries could largely make the
switch.
Due to the progressive tax in The Netherlands, working 5 days instead of 4
doesn't earn /that/ much more money and if you have toddlers, you will spend a
day less for daycare, a day extra with your kids and probably have more time
for the fun things in life as well.
As a salaried employee I often chose a 4-day workweek as well when living in
The Netherlands. But once I started freelancing, the 5-day workweek seemed the
better choice for me. As freelancer you are taxed a bit less compared to a
salaried employee, so there's more incentive to make as much money as possible
during the workweek.
\---
[0]: [https://www.equaltimes.org/a-four-day-work-week-is-
only-a#.X...](https://www.equaltimes.org/a-four-day-work-week-is-
only-a#.XPxZxC2B27w)
~~~
Freak_NL
At my small (Dutch) company the standard contract is 36 hours, so most
employees work 4 × 8. I have a four month old son, and my wife normally works
4 × 8 as well, but she does 3 × 8 + 4 until we find our bearings with the
young one.
That means two days of daycare (partly subsidized by the government), one day,
each, at home with the child, and one day where she works from home for four
hours.
This is fairly typical for white collar workers in the Netherlands. Four days
is more than enough for me.
------
madspindel
"In thirty years America will be a post-industrial society with a per capita
income of $ 7,500. There will be only four work days a week of seven hours per
day. The year will be comprised of 39 work weeks and 13 weeks of vacation.
With weekends and holidays this makes 147 work days a year and 218 free days.
All this within a single generation."
From The American Challenge by Jean Jacques Servan-Schreiber published in
1967. Too bad this will never happen since most managers are workaholics.
~~~
benjohnson
In my estimation - you can almost live this way if you limit your
expectations to those of someone from 1967 - limited food choices, a single
car, a small home, frugal car-based vacations and healthy living.
~~~
esturk
Funny you added 'healthy living' at the end. It makes you wonder why people
would live beyond their means to be unhealthy.
~~~
sokoloff
Because people are terrible at accounting for the future. Shows up in diet,
exercise, personal finance, and probably a bunch of other places.
------
frankbreetz
There is nothing to disagree with here, but I feel like America is so far to
the right that an idea like this will be answered with "people are so lazy"
and "you signed a contract" or something similarly ridiculous. My response to
all this is: wasn't the goal of our forefathers to give us a better life?
Doesn't that mean fewer hours working? Even if my parents grew up in the best
economic period of the past thousand years, shouldn't my life be marginally
better than theirs? Maybe I am a "wuss" compared to the people who stormed
Normandy, but maybe those people are wusses compared to medieval people, and
those people are wusses compared to cavemen. Shouldn't you want to create the
type of world where your wuss children can survive?
~~~
daodedickinson
The problem is there are so many over-demand jobs and under-demand jobs
because schools stole so many billions from students learning information that
could not be put to fruitful use. And those jobs are so unequal in how much
work they take. So you have many jobs or situations where people are paid to
sit and many like in medicine where more people might die if you want to sleep
a healthy amount. The time should fit the job. But the pay should fit the
time, and then that's the next problem to solve, and so on.
------
neilv
That would work for some kinds of work.
But I live to work, and often work 7 days.
I can't imagine doing a project with delivery time pressure (like a startup
trying to execute in a timely manner), working only 4 days, with a 3-day gap.
I'd rather have flexible hours, and an emphasis on working sharp in the hours
we do put in -- not frequent 3-day interruptions of project mental space, and
not putting off the gratification of seeing the project come together.
~~~
stevesimmons
Me too. I feel very lucky to have found a company that is a perfect fit for my
skills and interests, and that is at the right stage where my working hard now
will make a big difference.
I can't imagine _not_ working 6-7 days a week.
~~~
Apocryphon
That's great for the both of you. But workaholics shouldn't get to dictate the
norms for the rest of humanity.
~~~
neilv
Agreed, no one should dictate norms.
But I'd rather have expectations of flexibility (e.g., take a day off or a
short day because I didn't sleep well, or for a family activity, or to enjoy
the nice weather, or because I just finished a work crunch) than (I imagine)
expectations that, on those 4 days, one had better be there the full day, at
least pretending to work.
For some jobs, flexibility could mean that someone can arrange with their
manager&team to personally have a predictable 4-day schedule.
------
filleokus
Have there been largeish tests of company wide four-day work weeks in the
software sector?
Here in Sweden the movement has mostly been pushed by the left, and for workers
with lower wages/burnout/high amounts of sick leave etc. I think it's mostly
been tested in the health care sector. I think it lowered the amount of sick
leave, but was too expensive to keep since those types of jobs are very hard to
make more productive. I.e., we always need nurses on staff at the ER, or to
look after the elderly.
Most office-type work seems that it could potentially benefit from this
though.
~~~
Swizec
I don't know about largeish tests, but 37signals is famous for their 4-day
workweeks during summer.
[https://m.signalvnoise.com/why-we-only-work-4-days-a-week-
du...](https://m.signalvnoise.com/why-we-only-work-4-days-a-week-during-
summer/)
~~~
thatfrenchguy
That article makes it sound like they don’t go on vacation in exchange, which
is a bad deal.
Americans already take pitifully little time off.
------
mto
I work 25 hours a week remotely as a freelancer (although I'm available in
Slack more or less all the time) and earn nearly twice as much as during my
PhD. The difference is that in more than 3 years I never really got into
demotivated/frustrated phases. During my PhD I often felt depressed sitting in
the office all day, never getting a bit of daylight... and paying 300€ a month
to have someone walk my dog.
I have lots of time for learning new stuff and also teach a course twice a
year at a local college for some extra cash.
Started when my daughter was born and never stopped ;). Couldn't be happier
and I really hope I can somehow keep it that way.
------
SteveGerencser
My dad ran his factory this way from day one. I forget exactly when they
started, late 80s I think, and they always worked 4 10 hour days, Monday thru
Thursday, 3 day weekend. Worked great when they went to 2 shifts as well, with
4 hours a day of downtime for maintenance to do their magic.
Employees loved it, management loved it, the only ones that complained were
some of their very large customers, and even they got used to it after a year
of not being able to reach anyone on Friday.
------
ksec
Well, in many places and across many industries, it would be great if we could
get a 5-day work week as the standard in the first place.
Instead, not only just tech industry in China, many are now moving more
towards 996.
~~~
FabHK
where 996 is 9am to 9pm, 6 days a week.
Jack Ma, defending 996:
“To be able to work 996 is a huge bliss,” China’s richest man said. “If you
want to join Alibaba, you need to be prepared to work 12 hours a day,
otherwise why even bother joining.”
~~~
dheera
Jack Ma is a massive hypocrite IMO. In another speech Jack Ma said:
"Independent thinking, teamwork, care for others, these are the soft parts.
Knowledge may not teach you that. That's why I think we should teach our kids
sports, music, painting, art -- making sure humans are different from
machines."
Yeah, the FIRST thing you need to do to do that is to end this 996 horror. 996
IS treating humans like machines. People already have kids and parents to care
for; if you want them to engage in sports, music, painting, art, they need
normal work hours.
------
disconnection
From my own research, I would say that 4 days is still too much in many cases:
[http://www.disconnectionist.com/blog/becoming-a-part-time-
su...](http://www.disconnectionist.com/blog/becoming-a-part-time-
superhuman.html)
------
hobo_mark
European here. This summer I'm taking all Fridays off until October (since I
had too many holidays left). Yesterday was the second time; I still showed up
in the office but just worked on my own projects all day (it's allowed). Once
I was in the flow, I've been doing the same on Saturdays (like today) and
Sundays (albeit shorter hours). I'm definitely making more progress than when
I was only working on it at night, try it if you have the opportunity (and no
kids/spouses of course).
------
throwaway82137
Data point from BigCo: almost no-one (at a US BigCo) would take reduced hours
for proportionally less pay.
I work at a Silicon Valley BigCo, and I work half-time (20 hrs per week). I
got this by just asking for it, though it did require high up approval. I
didn't even hide the fact that it was for the purpose of personal projects. I
get half my salary, and half my stock and bonus. It is still more than plenty
to afford living here for me. I have 1 kid.
I've done this for 5 years now, during which time I've spoken to many BigCo
colleagues about this, and I'd estimate about 100 of them know my "deal", and
so far no-one has followed my path. Note that 80% and 60% are also options.
When asked they uniformly say that they couldn't take the reduction in pay.
Median total comp for these people is easily 300k. Some don't even have kids.
So no, if you want to improve society you'd have to force a 4-day work week by
law; otherwise no-one will follow through.
------
bryanrasmussen
I can barely afford working 5 as it is, I have to level up my wages or get an
extra source of income, not decrease.
~~~
isostatic
One reason housing is so expensive now compared to the 50s is that there are
10 working days per household now, where there were 6 in the 50s.
With more money, the price of limited resources increases to fill the void.
If a 3 or 4 day week is mandated, there’s less for housing, it’s the same
demand, the same supply, so prices have to come down.
~~~
bryanrasmussen
one reason that prices did not increase to fill the void immediately is that
systems have an information lag between events, thus I expect a few years of
mismatch. Furthermore some resources are going to increase in cost no matter
what because external pressures - climate change - are going to drive up the
costs of those resources (primarily food) no matter what.
------
sakisv
It's kind of disheartening to read things like this on one hand and on the
other hand read news that Austria "increased the flexibility" of the working
pattern by bumping up the limit of what is considered legal to 12 hours:
[http://www.mondaq.com/Austria/x/733020/employee+rights+labou...](http://www.mondaq.com/Austria/x/733020/employee+rights+labour+relations/Amendmends+To+The+Working+Hours+Law+The+12Hour+Working+Day+Light)
Given that this is a country which is in the EU I'm very much worried about
this being adopted by more countries and becoming the norm, all in the spirit
of maintaining our "competitiveness"
~~~
asdf21
What issue do you have with people working three twelve hour days?
------
purplezooey
The worst part of this is that it will, like most everything else in the US,
be an option for the wealthy, some white collar workers and those fortunate
enough to live in California while leaving everyone else behind.
------
tempodox
> Critically, they also say workers were 20% more productive.
So, there wouldn't even be an actual loss in productivity. Still I predict
this recommendation will fall on deaf ears. The problem is not economic or
rational, it's cultural and religious. “Society” still adheres to the notion
that a person's worth is largely determined by the extent of their economic
activities (independent of productivity, obviously). Working less is perceived
as a dangerous moral failure (“Idle hands are the devil's workshop”, in
Christian cultures).
------
vermilingua
I’m sure this is great for some people, but what about casual workers? What
about people that rely on overtime to get by? I can only see this pushing
employers into further reducing hours, screwing those people.
~~~
coldtea
We shouldn't optimize society based on the needs of "people that rely on
overtime to get by", we should optimize so that people don't have to rely on
overtime to get by.
It's like someone was asking back in the day: "Abolishing child labor? What
about all those 10 and 12 year olds that depend on their job to eat?"
------
SubiculumCode
I submit that the problem is the denominator: 7
The ratios:
5|2. Just right and too little
4|3. Too little and just right
Weeks should be 8 days
5|3. Just right and just right.
7 has nothing to do with human health and happiness. Why stick with that week
length dogma?
8 days a week, the Beatles had it right.
------
xtrimsky1234
It would only be great if my kids stay in school 5 days a week. Otherwise it will
be just another day I need to entertain my kids. Sure 3 day weekends for
traveling would be great, but I already do that with PTOs.
I want 6 hour work days 5 times a week. Not 4 work days. And banks being
closed an extra day would just be annoying. (not that I go to a physical bank,
but just transfers and such).
------
Causality1
I would be happy just to switch to a five day 40 hour work week.
------
johnmarinelli
A few months ago, I requested a 32 hour work week at my job. I took the
appropriate paycut (I'm a single bachelor that doesn't spend much anyway).
Instead of a 4-day workweek, I do ~6.5 hours for 5 days. It's been great and I
couldn't recommend it more.
edit - I should add that I started the job at 40 hours/week, and after 1 year
of being there I asked for 32 hours/week.
------
stmfreak
I have been trying to work from home one day per week and wish we could
standardize on this. It would reduce traffic 20%. It saves everyone a round
trip commute and that is time that can be put to productive use. I also
appreciate having a solid day of few meetings.
A four day work week would be fine, but I suspect people would just cram more
meetings in and it may become less productive.
------
chemmail
I've been doing 4 day work weeks and it's like I'm on vacation all the time. We
need this.
------
jowdones
"The dogs bark, but the caravan goes on".
"the caravan" == employers.
------
mycall
I would love to but a 10 hour day would increase my time stuck in traffic 50%,
so I would actually lose free time overall.
------
deevolution
How do I go about convincing my company to give me a 4 day work week? Will I
need to sacrifice part of my wage?
------
RandomInteger4
How would this affect service workers?
------
triplee
I've been saying this for years, so I'm glad I agree with the experts.
------
ppcelery
we are switching to a six-day working week, in china...
------
amelius
Good news, except we've just started a trade war with China.
------
amiga_500
But then my rent would drop 20% and I'd just be better off.
------
wrong_variable
The idea is good in principle, but it's going to be gamed to death.
Some people derive satisfaction from their work !
Others don't.
Also will the pay be the same ? Call me a cynic but knowing business owners
they will never accept lower hours for same pay. They will spend all their
time figuring out how to game it.
All these solutions and studies seem like busywork for thought leaders who have
nothing better to do.
The best solution is quite simple - Universal Basic Income.
It can't be gamed as easily and by definition it costs less and gives
everybody more freedom at the same time.
Normally I feel you have to be skeptical of rules that restrict freedom (like
"you can ONLY work 40 hours").
~~~
perfunctory
The same logic could be applied back in the day when 40-hour week was
introduced. Yet somehow we managed. There is no reason we can't do again.
~~~
wrong_variable
I never said the introduction of 40-hour week was bad.
It's just more economically efficient to have UBI.
Show HN: FundChan – funded channel messaging - jimbursch
https://fundchan.com
======
jimbursch
We are also applying for YC:
[https://news.ycombinator.com/item?id=11447417](https://news.ycombinator.com/item?id=11447417)
Top Russian Official Warns of ‘Catastrophic’ Population Loss - zeristor
https://www.themoscowtimes.com/2019/07/03/top-russian-official-warns-of-catastrophic-population-loss-a66259
======
8bitsrule
At the very bottom, the article states (apropos of nothing earlier) that cancer
rates in Voronezh grew 20% since last year. The oblast is located ENE of Kiev.
USB-C and MacBook Pro - tschellenbach
I wonder if I'm the only one that's going a little crazy with the new MacBook Pro USB-C connections. I've experienced issues with my keyboard disconnecting, HDMI disconnecting, power disconnecting and display lag caused by the hub. Many hubs don't fully work, protocols are different, etc.

So this is how I got my external screen running at 60Hz while using USB-C on my 15 inch MacBook Pro:

Step 1: I bought this screen (not an affiliate link) https://www.amazon.com/gp/product/B00YD3DBOC/

Step 2: This cable https://www.amazon.com/gp/product/B074V5MMCH/

Step 3: I opened the menu on the monitor and changed the display port connection from DisplayPort 1.1 to 1.2

Step 4: I connected my 2 USB-C connections on opposing sides of the MacBook (no clue why this matters, but it seems to make a difference). You'll also want to make sure you plug the charging USB-C cable directly into the MacBook. Some hubs slow down charging.

Step 5: I went to Display preferences and clicked "scaled" while holding down the option key (adding this in case you're new to Macs). My personal preference is 3200x1800

I was very happy when I finally got it to work smoothly. It's a little crazy how complicated this is though.
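If it helps anyone debugging a similar setup, one quick sanity check is to
dump macOS's display info and see what the monitor actually negotiated (a
sketch: system_profiler SPDisplaysDataType is a real macOS command, but
whether the refresh rate appears in its output varies by macOS version and
connection type):

    import subprocess

    # Dump macOS display information and pick out resolution/refresh lines.
    info = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout

    for line in info.splitlines():
        if "Resolution" in line or "Hz" in line:
            print(line.strip())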
======
megasquid
Hello. I honestly haven't experienced any problems with USB c and my macbook
pro. I do have some different adapters than you do though. Here is what I'm
using. Did not require any custom display settings. Just plug and play.
2 HDMI to usb c adapters - [https://www.amazon.com/Adapter-MOKiN-Macbook-
Chromebook-Gold...](https://www.amazon.com/Adapter-MOKiN-Macbook-Chromebook-
Gold-
Plated/dp/B06ZYKQDC4/ref=sr_1_11?s=electronics&ie=UTF8&qid=1508801164&sr=1-11&keywords=usb+c+to+hdmi)
2 of these monitors -
[https://www.amazon.com/gp/product/B0148NNKTC/ref=oh_aui_deta...](https://www.amazon.com/gp/product/B0148NNKTC/ref=oh_aui_detailpage_o06_s00?ie=UTF8&psc=1)
Lego 60th Anniversary website down before its celebration - app4soft
https://www.thebrickfan.com/lego-classic-bricks-on-a-roll-10715-collectible-booklet-promo-at-walmart/
======
app4soft
Quote from linked article, published by "The Brick Fan" on 25th of January,
2018:
> It looks like there’s a promotion at Walmart in which you can get a FREE
> collectible booklet and retro building instructions when you purchase the
> Bricks on a Roll (10715) set. All you have to do is get the set, go to
> www.lego60th.com (website down as of this writing)...
Speaking of the British - surfingdino
http://c.moreover.com/click/here.pl?z4596068789&z=1250248780
======
ErrantX
This is hilarious :) I honestly thought it was a joke... but now I am
uncertain.. he seems to _genuinely_ think that Britain is like this :S
The best bit is the wonderful caricature of Oxbridge interviews, as if the
modern world is left behind once you set foot in those hallowed halls :D One
guy at our school that went to Oxford worked like absolute hell to get his
grades, pass his entrance exam and pass the exhausting round of interviews (2
days, testing all sorts of aptitude).
Sure; the colleges have a tradition of "sprawling on sofas" while enjoying a
glass of the good stuff. But it is just tradition!
_Aged 18, perhaps hungover, you read out your pitiful but elegant essay. The
tutor points out gaps in your knowledge. For an hour, you talk your way around
those gaps._
Hahahahahahahahaha. Ahem. All of the friends I have that went to a really top
flight university (Oxford, Cambridge, Durham, etc) were basically working flat
out for their exams from about January every year..
(I was the drunkard.. see "winging it" below)
_Traditionally, elite Britons then leave education aged 21. Until recently
they rarely bothered with graduate school._
Meh, classic nonsense confusion regarding the British and American education
systems.
_Britain’s rulers still struggle to judge scientific arguments about nuclear
energy or climate change_
Very much struggling to see America's, or indeed any other country's, "rulers"
doing any better.
_It was the urge to amuse that recently prompted Cameron to riff on an old TV
ad and shout “Calm down, dear,” at a female Labour MP._
Failing to understand British humour, he only got it in the neck because at
that level of politics you have to work with international norms.
_Admittedly, ignorance sometimes saves Britain’s rulers from error._
And here we really get to the crux of it. We are not talking about "winging
it", or an aptitude for rhetoric, but "ignorance". I'll be first to admit that
the UK political elite are, to some extent, "clueless" on technical topics.
But... how is that any different from any other politician in the world? Is
that not why they have advisors? Hmmmm.
Nah, this is just some sneer-y anti-British hit piece. A caricature of the
great and powerful of Britain as bumbling idiots who get by on a wing and
a prayer.
But.
Winging it very much is a British tradition. One I have always been
consistently proud of, because we tend to be fucking awesome at it. I winged
it through school, university, life, career. And it works, brilliantly.
So, ba shucks ;)
I could go to America and make a casual, 2 second, assessment that the masses
were pretty crass (seriously, the amount, scale and volume of advertising on
your TV is terrible!), thick people and that the ruling elite were basically a
bunch of manic argumentative idiots. Because they are the most vocal or
visible types of people. I could go to France and make a similar assessment of
the people there as rude and xenophobic. etc etc. But that would be a
disgustingly shallow view. Quid Pro Quo. This is just thinly disguised
xenophobia and the FT should be ashamed to print it.
~~~
jdminhbg
The author, Simon Kuper, is British and studied at Oxford
(<http://en.wikipedia.org/wiki/Simon_Kuper>), so while he may be wrong, it's
not because he's unfamiliar with the country or the eduction system.
~~~
petesmithy
He has a British passport. He was born in Uganda of South African parents,
spent his childhood in the Netherlands, "also lived in Stanford, California,
Berlin and London", and studied at Oxford and Harvard. He currently lives in
France. He should stay there :)
------
corin_
This is a really horrible article, and it's clear that the author doesn't have
a great grasp of English culture by the bad generalisations he makes.
While I haven't bothered with University, I did go to an expensive private
school (a step below the likes of Eton) (fees paid by a scholarship), and I
have had some links to Oxford University which technically make me an alumnus
of an Oxford college and gave me the experience of the university without
having studied there (long story). So I'm pretty familiar with the people this
article is trying to talk about. For example I have been in the same class as
the son of Peter Hitchens (the brother of Christopher, who is mentioned as an
example in the article), and also the same class as a boy from the Getty
family.
Overall, I would say that as a country we value talking and arguing highly as
skills, more so than many countries. It definitely becomes more noticeable the
higher you climb on the social ladder, but that is generally just because the
better educated you are, the more practise you get and the more you learn.
It's not that middle/upper class education teaches people these things, it's
just that, as with all subjects, private education tends to give people an
easier ride.
As for the idea that at Oxbridge it's more about talking than knowing: this was much
more true before I was born, a few decades ago. It's certainly what many
people who went to those universities in the 50s-70s would say.
_"Traditionally, elite Britons then leave education aged 21."_ That has
nothing to do with being "elite", simply that the majority of people who go to
university don't study past their BA.
_"When Tony Blair hinted that Iraq’s 'weapons of mass destruction' could hit
London within 45 minutes, the establishment mostly believed him."_ Actually a
huge number of us didn't believe him, but we blamed it on lies rather than
mathematical difficulty. Many of us took to the streets in protest against
these lies.
_"Educated Americans would often praise Blair for arguing the case better
than President Bush could."_ If you have to draw comparison with GWB to prove
that someone is good at arguing a case then you've lost the argument already.
I don't know a single working class person who doesn't manage to appear more
engaged when speaking than Bush.
_"But in general, Britain’s ruling classes are funny speakers."_ Nonsense.
Britain as a whole tends to have a sense of humour, in my experience the upper
middle class has _less_ humour than lower classes.
_"It was the urge to amuse that recently prompted Cameron to riff on an old
TV ad and..."_ It wasn't the urge to amuse, and anyone who tries to suggest
that Cameron, or frankly any of our leading politicians, has a sense of
humour, hasn't spent enough time listening to them talk. That was Cameron's
attempt to appear hip and youthful.
_"Admittedly, ignorance sometimes saves Britain’s rulers from error."_ How
can such a claim be made without giving a single example or a single piece of
logic to justify it? I don't know if it's a good or a bad point because,
without any context, I've no idea what he's even trying to say.
------
swombat
I went to Oxford. I studied Physics. Yes, I learned to "wing it". If you can
maneuver your way through a "tutorial" (2 students, one tutor) where the
person evaluating you is one of the smartest people in the world and an expert
with 40 years more experience than you, and you're hungover and didn't really
study the material properly, and still come out looking ok, you can deal with
almost any situation where you need to "wing it" in the future. It's solid
training.
That said, most of those students were very, very, very smart. To me, Oxford
was the place where I went from finding it fairly easy to get whatever grades
I wanted to struggling and working hard to get the minimum I found acceptable
(a 2:1, or Upper Second, result in the overall degree). That was because the
final grades were all relative, and there were a lot of very, very smart
students I was competing against.
And, as far as the subject studied is concerned, don't forget that in the UK
people rarely go on to work in the subject they studied. Sure, it's still
better to do hard sciences, but apart from lawyers and doctors, most people
end up doing something else after university.
Which is fine, imho - university is not and should not be training for a job,
but, instead, training for life. And as far as life is concerned, in the world
as it is today, winging it is certainly a more valuable skill than quantum
mechanics.
~~~
regomodo
Small case study, I know, but I work with somebody who just came out of
Cambridge and did a CS-like degree.
He cannot program for shit but talks like he can; show him a bit of C(++),
PHP, JavaScript (even bash or Python) and his eyes just glaze over. I was
teamed with him but he was forever unable to get anything done. In the end I
was left with it all and made much better progress, and he was left to book
his time to overheads.
I'm certain he's an edge case though; I've met a few others who went to
Oxbridge and they are definitely on the ball. However, I've found that those
who went to Bristol, Imperial or Loughborough are the smartest.
~~~
swombat
That's not a case study, just an anecdote. In my experience, there were almost
no people in Oxford that I wouldn't consider "very smart". Probably the only
place in the world where that's been true.
That said, I don't know how Cambridge's "CS-like" degree is/was, but Oxford's
was very bad at actually teaching programming.
Then again, are you really sure he went to Cambridge? Smart people have no
problem with learning to program when they need to. That guy doesn't sound so
smart...
~~~
regomodo
Yeah, anecdote is the correct term. He definitely went to Cambridge, which is
why we questioned his choice of employer (the hirer's decision). It doesn't
matter what language he's given a task in; others are always curious as to why
he's taking so long or when it's ever going to be done. It's not like he's
making a cathedral of every task, as ctrl+c & ctrl+v are his key tools, along
with a good smattering of ugly hacks (I get the feeling he has no grasp of OOP
or VCS).
He's comfortable in Access though.
------
pg
This phenomenon isn't limited to England. The one thing all Harvard undergrads
learn (probably as much from one another as from the professors) is how to
defend themselves in conversation. Some are smart, and some aren't, but by the
time they graduate they all know how to sound smart.
~~~
bane
A trait I've noticed when working with or arguing with the products of elite
universities is the incessant ability to respond to most challenges in the
form of a blank stare. Harvard grads in particular, but I've noticed it in
some military officers as well.
They won't get riled up, they won't back down, they won't...well...do anything
in particular.
While being incredibly infuriating, it also makes them nearly impossible to
read or assess. Are they smart? Are they dolts? No idea!
This enigma-like quality _can_ get one very far in certain contexts. The
ability to operate in elite positions without becoming ruffled is a
tremendously valuable skill.
The downside of course is that if things really do go pear shaped, the normal
response is to show some kind of stress reaction and hopefully buckle down and
get to work. Not acting like there is anything in particular going on just
makes those around you wonder if the ol' gears are actually turning at all.
Are they out of touch? Do they care? It can be tremendously demotivating to
those around.
A company I worked for early in my career had to let two people with this
trait go (both top-tier uni graduates) because the management didn't think
they were taking a then current crisis seriously enough. We came to find that
with both of them, they were effectively doing no work at all as there was
almost no extra work that came out of their leaving.
When asked why they were kept around for so many years, promoted etc.,
management simply responded that they seemed well poised, like they knew what
was going on, attended tons of meetings, sent out lots of email...so it
_looked_ like they were burning furnaces of activity.
When the crisis struck, they kept plodding along this track, scheduling
meetings, refusing to let them run long, sending out dozens of emails a day,
long after the rest of the team had changed behavior patterns to deal with the
issue. It became such a nuisance, especially the content free, but tightly
scheduled meetings, that they were finally put under higher scrutiny.
It's a office-space-esque/dilbert-esque lesson I've kept with me through my
career and have run into this trait dozens of times.
I've personally found it so infuriating a trait to work with that I'm loath to
hire people from certain majors in certain universities under the assumption
there is some kind of communications coursework that teaches people to respond
with this blank stare.
I've coined the term "the management stare" for this phenomenon.
On the other hand, the people who I've found to be top performers were
consistently the dynamic, emotive people in an organization. They responded
rapidly and naturally to changing conditions, showed outward emotions and got
really emotionally involved in conversations, meetings and tasks. They brought
an energy to the company that drove the rest of their peers along with them.
~~~
nradov
I have consistently found the dynamic, emotive people to be disastrous in
crisis situations. They tend to act rashly without thinking through the
consequences and often make things worse. When faced with a technical problem
I prefer dispassionate experts who can apply a disciplined, scientific approach
and never get flustered.
~~~
bane
I think I'm phrasing it wrong. I'm definitely not saying that people prone to
histrionics are good performers, just that people who show they are concerned
about the situation, have some skin in the game, and want to fix it above all
else seem to be the people that actually get it done best.
A rational approach to actually solving the problem is definitely best, I
agree.
A blank stare, followed by business as usual, is a non-starter in my opinion.
------
cstross
That article is a bizarre caricature, chock-full of misleading half-truths.
"You probably did school exams in just three subjects. At university, you only
study one" -- that's glaringly inaccurate. The underlying feature of the
British education system is that from age 16 if they're going to university
students tackle 3-4 subjects at "A" level, rather than a whole slew of them.
However, the depth of study for an "A" level qualification is supposed to be
(or was, in my day) roughly equivalent to finishing the first year of a degree
in that subject. The "at university you only study one" reflects the nature of
the British university system: rather than a hodge-podge of courses
culminating in a major, British students focus on a particular area from the
start -- other subjects are studied, but they don't rate separate examinations
or qualifications. It's a system based on specialization: narrow but deep
rather than broad but shallow.
Finally, anyone who went to Oxbridge having read that article and taking "nor
is workaholic study encouraged" at face value is going to be in for a very
nasty surprise ...
~~~
regularfry
This is really noticeable if you go through A levels, then end up in a
university course with a high foreign student intake. The first year is almost
_entirely_ catch-up, as those who didn't do A levels get the content pumped
into them so that the course proper can start in the second year.
------
tomsaffell
Some of this resonates, but this does not:
_Oxbridge’s teaching methods reward good talk_
In my experience, Cambridge University Engineering Department rewards those
who can _at age 19_ show _on paper_ a sound grasp of all branches of
engineering, from vector calculus, to materials science, to the physics of a
transistor, to thermodynamics.
Those who can do both that, and also speak well, are often 'poached' into
consulting and banking, whereas those who can only do the former tend to
pursue more technical fields. But to blame that on Oxbridge seems unfair.
On an unrelated note: I've worked with many Oxbridge graduates of Humanities,
Classics (Greek, Latin) and English Literature - all of whom (by selection of
my firm's hiring process) are very numerate. I.e. the two are not mutually
exclusive.
~~~
sabraham
That said, I'm American, went to Oxford and am now at Cambridge, and I can
assure you that most Humanities graduates are shockingly innumerate--I
seriously doubt most Arts students given the Math SAT would be able to do more
than shake a stick at it. The best evidence is probably in PPE; the only
people who stick with E(conomics) did Math/s at A-level. The system doesn't
encourage it. You do your 3-5 A-levels, and people tend to focus on the Arts
or the Sciences, with little to no overlap.
Also, I was never offered sherry at a tutorial, but I did have friends
studying English who were offered wine~
On a related note, this is a good summary of the differences between English
vs American values/elite educations:
[http://www.yaledailynews.com/news/2010/apr/23/whats-
better-o...](http://www.yaledailynews.com/news/2010/apr/23/whats-better-
oxfords-depth-or-yales-breadth/)
------
hugh3
There's a lot to be written on the subject of the British (or just the
English, the other parts of Britain being a whole different ballgame) and
speech. Certainly I don't know of anywhere else where a person's manner of
speech tells you nearly as much about their geographical and socioeconomic
origin. (But of course, I only speak one language so my attention is pretty
restricted).
In Australia they say there are three accents: broad, general and cultivated,
and which one you have is mostly about your socioeconomic level. In my
experience, though, it's more of a spectrum, and the "broad" accent goes in
several different directions depending on where you are -- I can usually pick
out a Queenslander, for instance.
The most interesting thing I've noticed about American accents is that you can
often tell someone's _political_ persuasion, at least on the radio, if not
from their accent then from their manner of speaking. Flip through the radio
dial in some unknown area and you can instantly tell whether you're listening
to a right-wing or left-wing show -- the right-wing voice is deeper and more
aggressive while the left-wing voice sounds higher-pitched and a bit naggy.
------
Qz
This writer writes well. Perhaps unaware of the gaps in his knowledge, he
simply writes around them.
~~~
tankenmate
Ahhh meta analysis; the funniest comment I have read on HN for weeks.
------
adamt
I studied CS at Cambridge having previously come from a below average state
school. Although the article is a cliché of half-truths, there is in my
experience some truth there. I had 2 interviews to get in, which lasted only 15
and 60 minutes. I only made it to about 25% of lectures and talked my way
through supervisions. Through good exam technique and my own independent
study, I walked away getting a first in each year. So contrary to most people
here I think there is actually a degree of reality to the article.
------
bluekeybox
As the old joke (fact?) goes, Sir Winston Churchill was giving a speech in the
House of Commons and someone nearby noticed that there was a handwritten
comment in the margin of his notes: "Weak argument: talk loudly."
~~~
ErrantX
Churchill was not the smartest (from a technical perspective) cookie in the
cookie jar; he was not a careful, quiet thinker; he was a bit racist, from the
"old school", and rather stuffy. But he was very, very, very, very good at two
things: patriotism and rhetoric.
Which, when you are in a war, is a good thing. The generals win the battles.
The politicians win hearts and minds.
Think of the "talk louder" (I have heard the story too, and I think it is true
IIRC) as simply a technical aspect of his ability, akin to the hacks, short
cuts and pieces of useful code we programmers have hanging around to get
things done faster :)
~~~
gruseom
Churchill was also a master of English prose - quintessentially English prose.
He is like Dickens in this respect.
------
gmantastic
I think the article misses a point - in order to speak well, particularly off-
the-cuff, one must first be able to think quickly and clearly. Looked at this
way, judging people on how well they speak is a useful short-cut to assess
their thinking skills (although not definitive).
------
tomelders
I'm British, but I have to say: this article is utter arse gravy of the
highest order.
~~~
MrScruff
I'm also English. Calling something 'utter arse gravy of the highest order'
isn't a particularly informative form of criticism.
~~~
corin_
It is a rather English insult, though.
------
lhnz
This article is complete bullshit:
> Nor is workaholic study encouraged. A South African relative of mine started
> his first “supervision” at Cambridge by confessing that he hadn’t read every
> single book on the reading list. “Good God,” said his supervisor, “nor have
> I. I put them down hoping that you’d look at a couple, and tell me what they
> said.”
That's just completely untrue. Just talk to some students from Oxford and
you'll find that many of them are massive workaholics and very intelligent.
It's not simply all 'talk'. You can't succeed in that environment simply by
being good at talking!?!!
~~~
rplacd
"I disagree with this article! Here's a hypothetical that proves it - I know
it's true because I disagree with this article."
I'm all for nuanced positions, though.
~~~
lhnz
Indeed. Well, I'm not going to be able to create a good argument because mine
is also based on the people at Oxford (and Cambridge) that I've met and how
they've acted, too. (It could just be that I only meet certain kinds of
people?) However, I suspect the author of this articles is speaking falsely
because it doesn't match with the reality I've seen/heard. You have some of
the brainiest people in the UK there and you expect me to believe that it
always comes down to talk?
For instance, my brother comes back from Oxford and I see the amount of work
he brings back with him. He reads and writes more in one term than many other
universities expect from their students in a few years... There are folders
and folders of notes proving the workload. The reading lists that he has been
provided have _not_ been finished. This is because they're pages long and it
is impossible to finish them in the time they are given. He often spends full
days revising or reading, and especially now that it is exam time he gives
himself very few hours to relax. Of course there is a mix of people: some that
literally work from 8AM till midnight and others that somehow seem to mix this
with being a socialite (perhaps those people don't do any work and just buddy
up to the professors by telling them that they do no work!? To me, that sounds
bizarre.) Despite the different kinds of people that are there I think it is
disingenuous for somebody with a relative that goes there to argue that
workaholic study is not encouraged. I have seen the opposite; I have seen
people under way too much stress and pressure to succeed and work hard.
Okay, not very nuanced but maybe you see why I call it bullshit.
~~~
rplacd
I have to admit - I have a cousin that's been through Cambridge (went through a
nanotechnology track, now taking a doctoral at Max Planck, will always feel
inferior to him) and I do agree with you: he's barely poked his head out and
reconnected with the extended family in ages. I'm interpreting this as
"complete devotion to work" because complete disconnection is rather unheard
of in the Asian family he comes from. So my observations line up with yours to
some extent. I can't prove the existence or lack of the opposite,
annoyingly enough.
------
will_critchlow
Yep. This rings true even of technical graduates. Makes them really hard to
interview because they are so good at _interviewing_!
I think it's an under-rated skill outside Oxbridge. I certainly think it's
helped me, and I only have the very edges of this skill that I picked up at
Cambridge.
~~~
tomsaffell
>Makes them really hard to interview because they are so good at interviewing!
Hmm. In my experience interviewing Oxbridge grads, I didn't find that. Being
good at talking around gaps in knowledge works very well in free flowing
conversations, but if an interviewer ask a specific question, and the
interviewee cannot answer it, then the ability to talk round it does no good.
E.g. I used to ask a question relating to the amortization of a loan, and I
would ask the interviewee to draw a graph of _time vs. loan balance_. You
can't talk your way out of not being able to do that.
~~~
will_critchlow
You're right. I'm talking about the ones with the pre-requisite knowledge, not
the total blaggers. It's hard to address their answers to the questions that
don't have right or wrong answers because they answer so well.
------
pclark
Learning how to speak and articulate what I am saying has been the most
valuable skill I have ever learnt.
I am constantly blown away by how poorly grown adults converse.
------
TheBoff
As a current Cambridge undergraduate studying computer science, I find this
absolutely outrageous.
The article seems to imply that the scientific fields simply aren't studied
here.
The particular example of Lord Cherwell is particularly misleading, as his
results were mistrusted by other scientists of the day.
Also, the application process is completely misrepresented. I had two
interviews, with the people who would be supervising me. I didn't get offered
sherry, they were sat respectably in chairs, and they asked me maths and logic
questions.
My director of studies informs me that he then runs all our results
(interview, A levels, personal statement) through a number of statistical
tests to work out who are going to be the best candidates.
Also, it's strange, I didn't think that Bill Gates and Mark Zuckerberg had any
effect on the legislative direction of America...
And the point about "workaholic study is not encouraged" is absolutely untrue.
The full reading list probably consisted of about 3 pages of literature. The
workload here is high: it's certainly not unknown to have a fifty hour working
week here.
------
paulnelligan
As an Irishman, it's instinctive to have mixed feelings about the British, our
once brutal, now friendly neighbour.
In any case, they must be doing something right. They have some of the best
institutions on the planet - the NHS, the BBC, and yes, their education system.
And their police force is really the best in the world. I've never met a Bobby
who wasn't professional, friendly, and reasonable, despite a very difficult
job.
------
mattmanser
Odd that he mentions Margaret Thatcher without mentioning she was once a
research chemist and had a chemistry degree.
Doesn't fit with his content-free story I guess.
P.S. The link is bad; it's not moreover.com, it's from ft.com.
~~~
srgseg
The moreover link helps avoid the FT paywall.
~~~
mattmanser
ah, fair enough then!
------
nagrom
So for those of us who didn't go to Oxbridge and didn't acquire this valuable
skill, how do we acquire it as adults?
~~~
corin_
I'm pretty sure that if you put on an upper class accent (think Hugh Laurie)
that would be enough to make the author of this article assume that you have
the skill.
~~~
wyclif
You mean RP: <http://en.wikipedia.org/wiki/Received_Pronunciation>
~~~
corin_
I've never actually heard that term before, though I am familiar with it being
called the Queen's English, and BBC and Oxford English.
But commonly it would be considered "upper class" within England, and probably
considered "English" outside the UK.
~~~
wyclif
Yeah, I've heard it called that, but my Brit friends usually call it "posh."
Incidentally, we have our own version of RP in the States; it's called the
"Mid-Atlantic accent", and many talking heads on news and radio are encouraged
to imitate it if they come from the South of the US or New England.
------
srgseg
"Anyway, running a country on eloquence alone hasn’t worked out disastrously –
or at least not yet"
Good grades are certainly not enough, but they are almost always required.
Oxbridge only invites the best students to interview, and even the top 5 most
academically competitive schools in the UK struggle to send more than a third
of their students to Oxford or Cambridge in a good year.
~~~
raffaelc
"Anyway, running a country on eloquence alone hasn’t worked out disastrously –
or at least not yet"
The largest empire the world has ever seen, whittled down to its homeland, a
naval base in the Med, and some godforsaken islands in the South Atlantic. I'd
call that pretty disastrous.
It'll only get worse. In an increasingly technical world, the Brits have the
wrong one of Snow's two cultures running the show.
~~~
AlisdairO
> The largest empire the world has ever seen whittled down to its homeland, a
> naval base in the med, and some godforsaken islands in the south atlantic.
> I'd call that pretty disastrous.
Actually, I would count the graceful degradation of the British empire (i.e.
without the entire country going to hell) as quite an achievement -
particularly considering it happened with a surprisingly low level of long-
term resentment incurred from the former colonies. Further, the claim that it
was due to poor management more than, for example, two ludicrously expensive
wars, is more than a touch disingenuous.
I suppose one could argue that the first of those wars was rather a waste, but
Britain was hardly the only country to be affected by that particular
insanity.
------
vixen99
There's a touch of truth in the article but for an ignorant elite they don't
do too badly. They are around 4th in line for Nobel Prizes in hard sciences
per inhabitant, and well ahead among large countries. The top 'winners'
are Sweden, Switzerland and Denmark, all with populations under 10 million
(1999), though I am not suggesting that detracts in any way from their
achievement.
------
petesmithy
This guy should stick to writing about sports.
Although I agree that the English value both 'winging it' on limited
knowledge, and banter / witty repartee. But that is regardless of background.
------
oceanician
Haha, the ending comments - yes, the lack of finance & maths knowledge really
has had an impact. It's so frustrating that those in the Cabinet Office here
don't even have basic maths knowledge.
I've no idea how this country's decline can be reversed. Knowledge-based
economy? What a joke :(
I wonder if there's any page listing the qualifications of all those in power
in the UK. Have their grades deteriorated over time, I wonder, haha.
------
code_duck
Why is this a link on c.moreover.com and not ft.com? All the sharing links at
the bottom of the page use an ft.com URL.
~~~
kmfrk
It circumvents the paywall.
------
16s
Seems silly to suggest that articulate people are uneducated. If that's not an
oxymoron, what is?
------
epo
It's probably a humour piece.
The trouble is that many foreigners tend to take it at face value because it
panders to racist stereotypes they hold about the British.
------
surfingdino
I tried to get into Cambridge. The first question I heard was "What make of
car does your father drive to work?" I told them and they politely rejected
me.
------
patkai
By the way, does the FT have editors or something?
New Harvard Research Reveals a Fun Way to Be More Successful - rarjunpillai
http://www.bakadesuyo.com/2014/09/be-more-successful/
======
poseid
interesting - thanks for sharing :)
Hypernetes: Bringing Security and Multi-Tenancy to Kubernetes - scprodigy
http://blog.kubernetes.io/2016/05/hypernetes-security-and-multi-tenancy-in-kubernetes.html
======
switchbak
They mention that this helps cure some issues with regards to resource sharing
/ memory usage, etc. But does each VM still have a static allocation of
memory?
One of the main benefits I have now is that if I run a number of containers
that all take various amounts of memory, I can just throw them on and they
share memory amongst each other quite efficiently. If I have to make a static
allocation of memory for a VM, I'll typically choose a conservative memory
number, and usually under-utilize the machine, wasting a lot of memory per-
instance. Not so bad since I chose per-pod, but still an issue.
As it happens, this same issue is why I'm leaning towards lightweight native
applications these days instead of an aggressive greedy virtual machine that
grabs a bunch of heap. Golang/Rust in particular.
~~~
scprodigy
Actually, scale-up is pretty easy for both VMs and Linux containers, but
scale-down is very troublesome for both.
And the scheduler will need the memory size (not right now, but inevitably).
------
sarnowski
That is what rkt with kvm is essentially doing as well, correct?
[https://coreos.com/rkt/docs/latest/running-lkvm-
stage1.html](https://coreos.com/rkt/docs/latest/running-lkvm-stage1.html)
~~~
mjg59
Not quite - rkt will bring up a new VM for each container, this approach only
brings up a VM per pod (ie, a set of functionally related containers).
~~~
philips
This isn't correct. rkt does a VM per pod.
------
andrewstuart2
This really does not appeal to me at all. The major point of docker containers
is not the image format, it's that the kernel can allocate resources more
intelligently. VM images work just fine for "shippable images."
What I'd rather see is an allocation layer for physical resources that just
cordons off the whole machine (physical or virtual) by tenant as soon as
previous tenant resources have been fully consumed, then reclaims hosts after
usage subsides. So as a provider I still only have one cluster to manage, but
as a consumer I still don't worry about _another_ layer of abstraction slowing
things down or pre-allocating resources.
~~~
chatmasta
I'm interested in the economics of (docker) containers vs. virtual machines.
Containers can run within a VM, but a VM can only run within a hypervisor.
Currently, if you want to resell computing resources, you need to rent or buy
a dedicated server, and run a hypervisor on it.
Containers enable a new class of reselling computing resources. Because you
can run a container within a VM, you can resell computing capacity on a VM.
I think we are going to see another abstraction on top of "the cloud," due to
this additional layer of reselling (new Russian doll on the inside, new doll
on the outside).
The physical abstraction is:
Datacenter > Floor Space > Server Rack > Server
The virtual abstraction is:
Server > VM > Container > Kubernetes|MESOS|...
Virtual is a 1:1 inverse of physical. Next step is datacenter interconnections
(i.e. multihost kubernetes or whatever flavor of the month IaaS software
people use).
~~~
derefr
People have been reselling containers without needing an intermediary VM
abstraction forever; look at any cheap VPS host offering OpenVZ-based "virtual
machines"—which are actually [resource-quota'ed] containers.
~~~
chatmasta
It's not a question of need; it's a question of ease of opportunity. It's now
easier to virtualize via container, and there are more opportunities, since
it's easier to get a VM than a dedicated server with hypervisor access.
------
markbnj
Can someone clarify how this compares to LXD, released by Canonical in Ubuntu
16.04? A lot of the keywords and concepts seem similar.
~~~
resouer
And another different side of HyperContainer is that it follows OCI spec,
check the runv project here:
[https://github.com/hyperhq/runv/](https://github.com/hyperhq/runv/), so
technically speaking, it's a hypervisor version of OCI, just like docker is a
linux container version of OCI. Seems rkt/clear linux or LXD does not.
~~~
shykes
Actually it's _runc_ that is the linux container version of OCI. Docker is a
higher-level abstraction which calls runc by default but can call any OCI-
compliant runtime, including runv.
See [https://blog.docker.com/2016/04/docker-
engine-1-11-runc/](https://blog.docker.com/2016/04/docker-engine-1-11-runc/)
~~~
resouer
It would be great to see if I can use dockerd to start runv containers!!!!
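For what it's worth, a minimal sketch of how that could look with Docker's
pluggable-runtime configuration (a feature newer than the 1.11 release
mentioned above; the runtime name and binary path here are assumptions for
illustration, not something from this thread):

    # /etc/docker/daemon.json -- register runv as an additional OCI runtime
    {
      "runtimes": {
        "runv": { "path": "/usr/local/bin/runv" }
      }
    }

    # restart dockerd, then pick the runtime per container
    $ docker run --runtime=runv -it busybox /bin/sh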
------
Veratyr
Speaking of security on Kubernetes, it's worth noting that most of the
"Getting Started" guides (e.g. [0]) to help you set up a cluster result in
completely unauthenticated API servers.
This means that by default, anyone can do anything they want with your
cluster.
There are no warnings, no suggestions that turning on the much better TLS
based authentication would be a good idea (or even how to do it), no nothing.
Be _very_ careful with Kubernetes.
[0]: [http://kubernetes.io/docs/getting-started-
guides/ubuntu/](http://kubernetes.io/docs/getting-started-guides/ubuntu/)
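To make the gap concrete, here is a rough sketch of the difference (treat the
flags as illustrative - exact names vary across Kubernetes versions, so check
the docs for your release rather than copying this):

    # what many guides produce: an open, plaintext API on port 8080
    kube-apiserver --insecure-bind-address=0.0.0.0 --insecure-port=8080 ...

    # TLS with client-certificate auth, insecure port disabled
    kube-apiserver \
      --insecure-port=0 \
      --secure-port=6443 \
      --tls-cert-file=/etc/kubernetes/pki/apiserver.crt \
      --tls-private-key-file=/etc/kubernetes/pki/apiserver.key \
      --client-ca-file=/etc/kubernetes/pki/ca.crt ...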
~~~
robszumski
Maintaining community-led documentation is a hard and time consuming process,
especially with a young project. I encourage you to get involved if you have a
few free cycles.
Sometimes you have to take a stance on these types of things, as we have done
with the CoreOS + Kubernetes community guides [0]. The guides are open source,
but full TLS, passing conformance tests, etc is required for contribution.
(I work at CoreOS)
[0]:
[https://coreos.com/kubernetes/docs/latest/#installation](https://coreos.com/kubernetes/docs/latest/#installation)
~~~
praneshp
The CoreOS documentation is a lifesaver (even when setting up k8s on a
non-CoreOS system). Thanks a lot (and I agree with the comments on
documentation/involvement).
------
jldugger
At a brief glance, this looks comparable to Magnum:
\- containers
\- OpenStack
\- multi-tenancy
~~~
scprodigy
Totally not!
------
resouer
Aha, the
chancellor ([https://www.youtube.com/watch?v=PivpCKEiQOQ](https://www.youtube.com/watch?v=PivpCKEiQOQ))
will feel great to see that finally he can eliminate IaaS/VMs and use Docker in
a production environment.
------
xbeta
Anyone knows whether there's similar thing for Mesos ?
~~~
scprodigy
A patch is available:
[https://issues.apache.org/jira/browse/MESOS-3435](https://issues.apache.org/jira/browse/MESOS-3435)
WTF is OpenResty? - garyclarke27
http://www.theregister.co.uk/2016/09/20/wtf_is_openresty_the_worlds_fifthmostused_web_server_thats_what/
======
mitendra
I was closely following OpenResty and its interesting ecosystem. Good to see
its growing footprint.
How Torch broke ls and made me vulnerable - joshumax
https://joshumax.github.io/general/2017/06/08/how-torch-broke-ls.html
======
binarycrusader
Torch probably doesn't even need to set LD_LIBRARY_PATH. If LD_LIBRARY_PATH is
only being set so that binaries distributed by torch work, then I'd strongly
suggest they use RUNPATH instead with $ORIGIN.
There are examples in various places:
[https://enchildfone.wordpress.com/2010/03/23/a-description-o...](https://enchildfone.wordpress.com/2010/03/23/a-description-
of-rpath-origin-ld_library_path-and-portable-linux-binaries/)
[http://man7.org/linux/man-pages/man8/ld.so.8.html](http://man7.org/linux/man-
pages/man8/ld.so.8.html)
[http://longwei.github.io/rpath_origin/](http://longwei.github.io/rpath_origin/)
LD_LIBRARY_PATH is really only for a developer's local use; it should never be
used for installed software.
Disclaimer: may not apply in some scenarios, I haven't used Torch, so this is
merely a general observation.
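A minimal sketch of the link step (the ../lib layout is an assumption about
how an install like Torch's might ship its libraries, purely for illustration):

    # embed a RUNPATH relative to the binary itself; the single quotes stop
    # the shell from expanding $ORIGIN, and --enable-new-dtags asks the
    # linker for DT_RUNPATH rather than legacy DT_RPATH
    $ gcc -o app main.c -L./lib -lfoo \
        -Wl,--enable-new-dtags,-rpath,'$ORIGIN/../lib'

    # confirm what was embedded
    $ readelf -d app | grep -i runpath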
~~~
catdog
This nice and clean solution is too little known, I think. Far better than
shipping shell scripts, which are very hard to get right, and most application
developers are not shell-scripting experts.
~~~
pjc50
RPATH is very nice, but it's a _huge_ pain to set in Makefiles because you
have to reliably escape the "$ORIGIN"
~~~
Spivak
(Don't do this) You could patch the binary after compilation with patchelf.
Also,
test:
echo "\$$ORIGIN"
outputs $ORIGIN. $$ translates to a literal dollar sign. \ escapes the dollar
sign in the shell.
~~~
binarycrusader
You can use elfedit instead, but really, it's not that much of a pain to just
do it right in the first place.
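For the Makefile escaping pjc50 mentions, the fix is fiddly but small; a
sketch, with the library path assumed:

    # $$ gives make a literal $; the single quotes then guard it from the shell
    LDFLAGS += -Wl,--enable-new-dtags,-rpath,'$$ORIGIN/../lib'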
------
anderskaseorg
How to safely prepend a directory to a PATH-like variable (in any POSIX-
compliant shell):
export LD_LIBRARY_PATH=/opt/whatever/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
Pull request sent to
[https://github.com/torch/distro/pull/228](https://github.com/torch/distro/pull/228).
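A quick transcript showing why the naive prepend misbehaves when the variable
starts out unset (illustrative):

    $ unset LD_LIBRARY_PATH
    $ export LD_LIBRARY_PATH=/opt/whatever/lib:$LD_LIBRARY_PATH   # naive
    $ printf '%s\n' "$LD_LIBRARY_PATH"
    /opt/whatever/lib:
    # trailing colon: ld.so now searches the current directory too

    $ unset LD_LIBRARY_PATH
    $ export LD_LIBRARY_PATH=/opt/whatever/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
    $ printf '%s\n' "$LD_LIBRARY_PATH"
    /opt/whatever/lib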
~~~
djsumdog
Good fix and good for sending them a merge request.
I still find it kinda baffling glibc would have this behavior for a trailing
colon (:). Like, I know it's probably legacy/compatibility, but it feels like
a security nightmare. ./ should be explicit, not implicit.
~~~
emmelaich
Also for leading colons. (but you probably knew that)
------
AceJohnny2
More concerning to me is that ld.so will interpret a trailing `:` in
LD_LIBRARY_PATH to mean to include PWD.
Where is this documented? It's not indicated in ld.so's manpage:
[http://man7.org/linux/man-pages/man8/ld.so.8.html](http://man7.org/linux/man-
pages/man8/ld.so.8.html)
Sounds like a bug in GNU's ld.so more than anything.
~~~
zwp
> Sounds like a bug in GNU's ld.so more than anything.
It's neither unique to glibc (AIX, Solaris) nor to LD_LIBRARY_PATH (PATH), nor
trailing colons (leading colons, adjacent colons).
This de facto standard becomes a little more obvious when one considers a
likely implementation (iterating over "strchr(arg, ':')" or whatever). Any of
these sequences will then yield an empty string:
PATH=:/foo
PATH=/foo:
PATH=/foo::/bar
And an empty string is equivalent to dot for chdir(2).
zwp:/tmp$ cd ''
zwp:/tmp$ pwd
/tmp
zwp:/tmp$
(This is not the same as plain "cd" (ie with no args), which is a special case
that takes you $HOME, of course).
I agree it's surprising and potentially dangerous.
FWIW, the exec _p_ () functions hide a similar wtf. From the Linux man page:
    The file is sought in the colon-separated list of directory
    pathnames specified in the PATH environment variable. If this
    variable isn't defined, the path list defaults to the current
    directory followed by the list of directories returned by
    confstr(_CS_PATH).
Security-conscious programs that clear the environment and then call e.g.
execlp() end up searching dot before the system path. Yay.
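The empty fields are easy to see by splitting a path string the way such an
implementation would (awk here is just to illustrate the parsing):

    $ printf '%s\n' '/foo::/bar:' |
        awk -F: '{ for (i = 1; i <= NF; i++) printf "field %d: \"%s\"\n", i, $i }'
    field 1: "/foo"
    field 2: ""
    field 3: "/bar"
    field 4: ""

Each empty field then behaves like ".".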
------
lloydde
Buried in there is "Torch machine learning library", which then allowed me to
figure out what software was being blamed.
[http://torch.ch/](http://torch.ch/)
~~~
ChristianGeek
At least you didn't read "ls " as "is" with a capital "i"... I only read it
to un-confuse myself!
~~~
bartread
Same. I was irritated enough by the badly formed headline to read the story to
try and figure out what was going on.
What's wrong with something like, "Torch machine learning introduces
vulnerability in loading of shared libraries." Whilst it doesn't tell the
whole story it does at least give a flavour instead of just sowing confusion.
------
matheweis
On most modern Mac OS installations, this is a non-issue. System Integrity
Protection doesn't honor any changes to LD_LIBRARY_PATH, presumably for
exactly this sort of reason. (Of course, one might have turned off SIP, in
which case this is no longer true, but it's nice to know it's the default).
~~~
HappyTypist
Very sensible. I wish OS X featured an even more "rootless" mode where "root"
only gives you sandboxed write access, and read access to files it creates.
------
thewisenerd
on a similar note, this (having '.' or $(pwd) on LD_LIBRARY_PATH) also broke
the `ls` command (and a bunch of other stuff) in the TeamWin Recovery Project
(TWRP) recovery on mobile devices when you were in `/system/lib` on a 64-bit
machine.
[https://github.com/omnirom/android_bootable_recovery/commit/...](https://github.com/omnirom/android_bootable_recovery/commit/9c3b7e990e162319cb379545b458838478a19eb0)
------
andreiw
you know, that's a pretty good plausibly-deniable backdoor, if you think about
it...
sounds like a pretty good thing to disable in ld.so...
~~~
eridius
How is it a backdoor? System services don't typically source the user's bash
profile before running, and even if they did, they don't run from attacker-
controlled directories anyway. At best you could compromise someone by
tricking them into cd'ing into a folder you provided, but that's not something
that would generally be called a "backdoor". And if you can get them to run
your install script, you've already "compromised" them anyway and
LD_LIBRARY_PATH is completely unnecessary.
~~~
lloeki
> And if you can get them to run your install script
If your script is obviously malicious then you're reducing your chances. Such
a change could seem innocuous[0]; then, cloning a repo containing a .so file in
the middle of a long list and cd'ing into it would trigger payload execution.
Distributing the maliciousness by chaining innocuous-looking actions is both
effective at bypassing human logical analysis and plausibly deniable (up to a
point).
[0]: [http://underhanded-c.org](http://underhanded-c.org)
~~~
eridius
If you clone a repo and cd into it, that's because you're going to actually do
something in there. An install script that clones a repo, cd's into it, and
then does nothing is extremely suspicious. But a script that clones a repo,
cd's into it, and runs `make install` isn't particularly suspicious, so once
again, there's no need for LD_LIBRARY_PATH.
------
Houshalter
Why is Torch's install so weird? Why can't it use standard package management,
and why does it need to install to my home directory? I'm not surprised to see
it causes security issues.
------
tebruno99
So I'm less concerned about Torch and more concerned about GNU ld adding
unexpected things. This sort of "magic" shouldn't occur.
~~~
emmelaich
It's not specific to GNU ld, it is standard UNIX behaviour. Always has been.
But you're right it should not occur.
------
foota
Seems like this may almost have been better done through a disclosure channel
with torch?
~~~
mannykannot
Maybe, but this particular issue has much wider scope, and is only
incidentally a Torch issue. A disclosure by the Torch devs might have gone
unnoticed by those who are not Torch users - I only read it because the HN
title mentioned ls, and I thought "that looks odd...".
------
fslkjhjdfhgj4j
Wow! That's a gotcha: a trailing : effectively adds $(PWD) to the LD_LIBRARY_PATH.
Thanks for sharing!
------
IshKebab
Ha, shitty text-based configuration systems strike again. Ask yourself if this
could have happened with Windows 10's PATH editor.
~~~
JadeNB
> Ask yourself if this could have happened with Windows 10's PATH editor.
Yes …? Well, I dunno; I don't know how Windows 10's PATH editor works.
Nonetheless, the issue seems to be with magic interpretation of special
configuration options, not with how those configuration options are entered.
(Note also that the configuration was done programmatically, not by the user,
so that there would have to be some kind of parse–deparse step anyway.)
Bad Programmers Are Not Good Programmers Who Are Slow - iamelgringo
http://www.knowing.net/PermaLink,guid,f6755acf-e8df-4f32-8d53-39b9a01992f5.aspx
======
xenoterracide
What if the guy who takes 5X to do the work has fewer bugs and more efficient,
secure code? Just because you code faster doesn't mean your code is better; in
fact it is probably worse, because you didn't take your time.
I'm merely saying that the speed at which you produce code is not directly
related to its quality. Release early and often does have its merits. But that
doesn't mean whip code out in an hour, and then fix bug after bug after bug.
~~~
pchristensen
That's the point (even inferable from the title) - he'd take a good programmer
who was slow (like you describe) over the bad programmer, who is slow, and
writes bugs, and takes a long time to fix those bugs, and makes more work for
other people. Fast good programmer > slow good programmer > fast bad
programmer > slow good programmer.
~~~
jimbokun
"Fast good programmer > slow good programmer > fast bad programmer > slow good
programmer."
Got a cycle in your graph :).
~~~
pchristensen
Yikes. Let's try again.
"Fast good programmer > slow good programmer > fast bad programmer > slow bad
programmer."
Looks like that makes me a "fast bad programmer". Not bad for third place.
------
jfalk
Here is an article I like a lot that shows just how bad a bad programmer can
be. <http://www.codinghorror.com/blog/archives/000781.html>
Basically, to sum it up, the author mentions a very simple question that he
asks potential programming candidates to solve. It's the kind of question you
can solve after making it through two or three chapters of any programming
book, yet the number of people, even senior-level developers, who can't get it
is astonishing.
~~~
fendale
I remember that article - I quickly reassured myself by writing fizz-buzz in
Perl, Ruby and PLSQL (my three most used languages) - no great achievement,
but I am better than the majority of comp-sci grads apparently!
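For anyone who hasn't clicked through: the fizz-buzz exercise in question is
small enough to sketch in a few lines of portable shell (one of many
reasonable variants, written from the well-known problem statement rather
than taken from the linked article):

    #!/bin/sh
    # print 1..100, replacing multiples of 3 with Fizz, of 5 with Buzz,
    # and of both with FizzBuzz
    i=1
    while [ "$i" -le 100 ]; do
        out=""
        [ $((i % 3)) -eq 0 ] && out="Fizz"
        [ $((i % 5)) -eq 0 ] && out="${out}Buzz"
        echo "${out:-$i}"
        i=$((i + 1))
    done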
------
edw519
Joel Spolsky has a real nice treatment of this:
<http://www.joelonsoftware.com/articles/HighNotes.html>
My favorite line from that:
"Five Antonio Salieris won't produce Mozart's Requiem. Ever. Not if they work
for 100 years."
Nice analogy to programming.
~~~
henning
Yes, it takes a real genius to work on bug tracking and project management.
~~~
mrtron
Accidentally hit the up arrow instead of the down, so I will respond.
His bug tracking/pm software is actually quite good. He doesn't actually write
software anymore from what I understand. And attacking him personally doesn't
change the validity of his point.
~~~
henning
How did I attack him personally? I didn't say he was stupid. I just don't like
his ridiculous, impossible elitist attitude towards hiring wherein, if you
take him at his word, approximately no one is good enough for him. Talking
about famous works of high art when you make your living off server-side
software that integrates email and version control through the browser is
ridiculous, too.
The lesson of his business is that you should try to produce boring, solid
software that solves problems in a very simple, straightforward fashion using
a handful of good ideas (maximize the probability of getting bugs in by, e.g.,
not having any required fields; prevent the improper use of metrics by not
adding much reporting functionality). He has long, old release cycles that
ensures polished, stable software. This is not rockstar genius ninja work at
all.
The Javascript wankery that's far beyond the capabilities of typical code
monkeys (the spellchecker, dragging and dropping columns in tables, etc) is
the least valuable part of FogBugz.
~~~
dreish
I'll take a stab at an explanation.
This guy went to the trouble to gather data, chart it, try to show his readers
something interesting, and your response is, "He produces dull, reliable
software, so he can't know anything about who is a good software engineer."
You did not address his point, the validity of his data, or his reasoning. You
simply said that his argument is not valid because of who he is and what he
does for a living.
Maybe if you'd said something like, "To the extent Spolsky's essay implies
that Fog Creek is a place where geniuses crank out the Requiem of software on
a daily basis, I disagree because X, Y, and Z," it might not have come across
as an ad hominem attack on his point about the differences between good and
bad programmers. Frankly I don't think many YC readers care about Fog Creek --
they're here for the more fundamental wisdom about programming.
~~~
carterschonwald
I think one point that's worth pointing out is that a lot of Spolsky's
opinions regarding what's good and bad in CS seem to be highly influenced by
the fact that he couldn't cut it in terms of theoretical computer science
coursework back in college, or at least that's the sense I get from his
writings....
The cost of a logo - j21
http://www.huhmagazine.co.uk/4140/the-cost-of-a-logo
======
larrys
This was already posted here the other day:
<http://news.ycombinator.com/item?id=4401018>
Ask HN: Are there any services for collecting emails pre-launch? - Banekin
I'd like to build up a little hype for a game I'm making by having a landing page with a video, and collecting emails before I launch.<p>Are there any good services that create something like this? I would make the page myself, but I think my time is better spent improving the app.<p>Thanks
======
mindcrime
Service to collect emails via a form you can embed in a page, and manage said
mailing list? Yes. You can find a few mentioned on this page:
<http://steveblank.com/tools-and-blogs-for-entrepreneurs/>
FWIW, I chose MailChimp for that, but there are plenty of other choices.
If you want something that goes even further, that is, doing the page, video,
and everything, plus email, then the answer is "I'm not sure." Probably there
are, but I haven't really looked.
You might find value in something like LaunchRock.com, but I'm not sure
they're exactly what you're looking for.
~~~
katherinehague
I'm a fan of MailChimp.
------
jaymstr
Jameson from LaunchRock here. We're literally days away from doing a full
rollout, but hit me up at [email protected], and I'll get you an immediate
invite. In order to do video, you'll need to use the widget on a page.
------
jvdmeij
<http://launcheffectapp.com> \- A Wordpress theme for viral launches. Haven't
tried it, looks nice though!
------
ahsanhilal
Just add a wufoo form:
<http://wufoo.com/>
I think you can redesign parts of the form with some basic html/css, to theme
it according to your designs.
------
brianbreslin
Are you talking about something like <http://launchrock.com> ?
~~~
Banekin
Yes, but it looks like it's invite only right now?
~~~
samgro
You just need to sign up 3 email addresses from your invite link. Nothing
stops you from doing that with 3 of your own addresses.
------
andrewtbham
<http://www.unbounce.com>
~~~
ahsanhilal
That pricing is ridiculously expensive...
This German Invention Supposedly Makes the Best Coffee You've Ever Tasted - prostoalex
http://www.businessinsider.com/r-crowd-funded-coffee-machine-touts-taste-through-tech-2014-12
======
mmastrac
"Coffee bags will contain a microchip to start the machine and dictate the
perfect brewing process."
Uhh pass. This is just a fancier and more expensive Keurig.
Show HN: Inbox evolved: reach new levels of productivity - DarwinMailApp
https://www.producthunt.com/posts/darwin-mail
======
DarwinMailApp
Hello HN
I’m Joey, the maker of Darwin Mail.
—
Darwin Mail aims to help you be your most productive when dealing with emails
& todos.
Problem: Inbox by Google was one of the best products they ever made. And then
they shut it down.
Solution: Introducing Darwin Mail, which aims to replace and become better than
Google Inbox ever was.
Features: Snoozing, Reminders, Dark Mode, Undo Send, Custom Backgrounds,
Templates, & much more according to your requests!
[https://www.darwinmail.app/feedback.php](https://www.darwinmail.app/feedback.php)
[https://twitter.com/joeytawadrous](https://twitter.com/joeytawadrous)
Darwin Mail will evolve to become great over time, thanks to its users, and
thanks to you.
You're welcome to join me on this journey.
Ask HN: How can you know when you've handled all edge cases? - patmcguire
======
ISL
You don't.
It is sufficient to handle every one you can imagine, and every one a
reasonable amount of testing can encounter.
Ask HN: Best HTML5/Bootstrap/whatever WYSIWYG web editor? - TXV
I need to prototype a bunch of web pages quickly. Actual minimally interactive web pages with decent layouts where I can wedge some JS in, but I don't want to waste time cooking HTML and CSS stuff. Thanks
======
brryant
Webflow CTO here. Check out webflow.com. Lots of tutorial content as well to
help you lay out pages: [https://university.webflow.com/lesson/intro-to-
flexbox-layou...](https://university.webflow.com/lesson/intro-to-flexbox-
layout)
For a quick preview of what it's like to design in webflow, check out
[https://flexboxgame.com](https://flexboxgame.com)
------
applecrazy
Webflow is the only solution to come to mind right now. They have the power of
PS with high-quality HTML export. I’ve heard it’s pretty good.
(I’m not affiliated with them)
Lambda School’s Misleading Promises - uptown
https://nymag.com/intelligencer/2020/02/lambda-schools-job-placement-rate-is-lower-than-claimed.html
======
songzme
I'm not a fan of bootcamps because I think a lot of them are more focused on
making money than actually helping people who need help.
Last year I decided to take action and started a free coding group at our
local library: [https://www.meetup.com/San-
Jose-C0D3](https://www.meetup.com/San-Jose-C0D3)
I show up before work every day (M-F at 8am) to help students who are learning
how to code. So far no students have gotten a job yet, but our group
consistently gets 4-8 students who show up promptly at 8am. I answer their
questions, give them guidance, and teach them best practices I follow as a
software engineer with 10 years of work experience. I ask for nothing in
return except the joy of students going "ahhh" when something clicked for
them.
Things are still early for us, but my dream is to inspire other software
engineers to help create a free and open learning center at their local
libraries so people have an alternative to coding bootcamps.
~~~
chrisseaton
> I'm not a fan of bootcamps because I think a lot of them are more focused on
> making money than actually helping people
This idea doesn't pass a common sense test to me - I'm sure bootcamps can be
profitable, but the people running them are used to building things that
scale. Bootcamps definitely don't scale. If these tech people were looking to
get-rich-quick they surely wouldn't be running a school, of all things, even a
profitable one.
~~~
glenngillen
I did some back-of-the-envelope numbers on the one I came into contact with
previously (also keep in mind this was 6 years ago now too!):
20-30 students per cohort who paid ~$20K upfront for a 12-week program. A 20%
signing fee (based on first-year comp) from the employer on placement.
We definitely were not paying top of market as some of these students ended up
at Uber and Facebook. That said, the all-in first-year cost between base +
signing bonus + equity wasn't much short of $200K. So:
30 * $20K + 28 * $200K * 20% = $1.72M/cohort
As for outgoings, all of the mentors were volunteers. As were most of the
instructors. The content is mostly a one-time sunk cost to produce and is
redelivered across cohorts. The largest overhead would have been a building
lease. The biggest constraint on growth is how large you can make a cohort or
how many cohorts you run (either multiple per year, or opening new locations).
Really felt like a bit of a racket that had found what was almost an
arbitrage: between the inability of Bay Area companies to find local talent,
the huge costs and risk associated with trying to relocate people via H1B, and the
desire for people to re-skill at any cost because tech jobs/salaries were
distorting everything else in their city.
Sure it's not a $1B outcome. It's a pretty profitable and repeatable business,
and especially given the limited downside risk (mostly carried by the
students, who've already paid).
~~~
shalmanese
I've had a look at the books at a few different types of training style
companies and the economics are always grim. It's one of those businesses
where to the outsider, it's impossible to believe they're not spinning off
unbelievable amounts of cash but they are always pretty marginal.
As additional evidence for this, the number of 1B+ exits in this space can be
counted on one hand. General Assembly, for example, was acquired for ~$400M
and it was one of the largest players. There was a player from Utah whose name
eludes me now that had a, I think, ~$2B exit but that might be the only one.
~~~
dathanb82
> There was a player from Utah whose name eludes me now that had a, I think,
> ~$2B exit but that might be the only one.
Are you thinking of Instructure? They're based in Salt Lake City, and
Marketwatch gives their current market cap as $1.86B
~~~
shalmanese
Yeah, I'm pretty sure that was the one, thanks!
------
raiyu
I've had a fair amount of experience with coding bootcamps, having helped
several friends and coworkers apply, get accepted, attend, and complete
bootcamps across various cities and institutions.
The idea of bootcamps is fantastic, they allow people that may have not been
exposed to computer science to get up to speed on technology and transition
careers.
However, there are several problems. First and foremost, while 3-6-9 months
are great when you are going from zero knowledge, the challenge is that this
isn't enough time to really become a junior developer unless you have prior
experience.
So you will need to continue to augment your education after graduation to
ensure that you get a well paying job.
Most people attending bootcamps are doing so after college and later in life,
which means even though some bootcamps are cheaper than attending a semester
or a full year of college they are still quite expensive because the students
are not "students" in that they are usually adults and need to figure out how
to pay for school, attend classes, while effectively receiving zero income.
And applying for loans is much more complex because this isn't the same as
taking out student debt for college.
The second issue is that all of these bootcamps say that they are for
beginners with zero knowledge, but looking over their curriculums that simply
isn't the case. Software engineering has gotten much more complex over the
past decade. When I first started fumbling around with it myself I could just
write some PHP or Perl code, and get up and running quickly. Today you need to
know about GitHub, JavaScript libraries, frameworks, and the typical "Hello
World" application isn't a direct route.
Most of these schools don't realize that their first students are already
exposed to these concepts, but later students aren't so they don't really
adjust their curriculum.
With Lambda in particular it is also a bit confusing because the school is
online, which should mean that you are able to provide the service at a lower
cost; but they are charging the same amount as physically attended, in-person
schools.
There are a lot of deceptive practices in the industry (Lambda isn't alone),
such as taking recent grads and giving them low-paying jobs as TAs (teaching
assistants) so that they can provide help to students at a lower cost while
also allowing the school to claim that they have higher placement.
The curriculums are really dependent on a per school basis, but I have seen a
bunch of stuff that simply doesn't make sense and makes it more challenging
for students.
One school had students do a group project that wouldn't count toward their
final for the first "mod" of the school. However, they continued to teach
things that you needed for the final, which was a personal project. If you got
assigned to a bad team you would be working much slower, unable to keep up,
and then your final, which has nothing to do with the group, determines
whether you proceed or repeat the course (they charge you to repeat). It's
also important to note that 50% of the students didn't pass the first mod,
which means you had a 50% chance of being on a slow team that would hamper
your learning. Their advice was that it is important to learn to pair, and I
agree, but when you get to your first job you are pairing with people who have
experience, not with a partner who has a 50% chance of failing out.
I firmly believe that bootcamps and providing secondary education choices are
essential and if done correctly can really begin to combat the monopoly that
colleges hold over education, but it's a challenging mission.
With education there are student loans that you can take out, and I think that
is essential to get this going in the US, because it is simply impossible for
most people to have no income and still pay for schooling for even 6 months,
much less a longer period of time.
The other challenge is that you really need to have 3 terms. Beginner,
intermediate, advanced. Each student can then apply based on skill set to
determine where they place and students can move from one to the next, with
each section being 4-5 months. If you did 15 months of education you would be
much better off than what the school provides. It's great to get from zero to
one in terms of knowledge, but students are still left far from having skills
that are immediately beneficial to employers.
There's definitely more work that needs to be done.
As for Lambda itself, when you look at how other coding bootcamps have fared
financially it doesn't paint a rosy picture. It's a challenging space to
operate and the VC style returns simply aren't there. If you want to offer an
online only education that is fantastic but you have places like codecademy
that do that and also do not charge you $30k for the privilege of basically
accessing information that is online for free.
The challenge with coding is that it really is much easier to get going when
you have someone you can ask questions from, so helping to improve that aspect
of it while providing it online at a low cost is really the challenge.
~~~
hintymad
Just curious, what's wrong with a community college? It's cheap. It's
flexible. Its admission rate is practically 100%. Its courses are not worse
than a code camp's. My relative went to a CC, and I reviewed his course work.
CCs do not spend much time teaching all the fundamentals, but they do teach
some. In their data structures course, they don't teach students why two
pivots are not better than a single pivot in quicksort, nor do they cover
discrete probability or classic complexity analysis extensively, but they do
teach (and practice!) asymptotic complexity and why vanilla quicksort may
perform badly. They don't teach students how to prove the boundary conditions
of ODEs, but they do teach intuition and how to solve and apply a wide variety
of ODEs. The examples can go on. They also hire teachers from industry to
teach courses on data processing, frontend engineering, etc.
With the belief that education is all about laying a solid foundation for
life-long learning, for a job or not, I don't really see any need for coding
camps. I'm not denying there are success stories, but I don't see how coding
camps make statistical sense.
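As an aside, the quicksort point is easy to demonstrate. Here's a rough Python
sketch (my own toy example, not from any CC syllabus) of why a naive
first-element pivot performs badly on already-sorted input:

    import random

    def quicksort_first_pivot(xs):
        # Vanilla quicksort using the first element as the pivot.
        # On already-sorted input every partition is maximally
        # unbalanced, so comparisons grow as O(n^2).
        if len(xs) <= 1:
            return xs, 0
        pivot, rest = xs[0], xs[1:]
        lo = [x for x in rest if x < pivot]
        hi = [x for x in rest if x >= pivot]
        lo_sorted, c1 = quicksort_first_pivot(lo)
        hi_sorted, c2 = quicksort_first_pivot(hi)
        return lo_sorted + [pivot] + hi_sorted, c1 + c2 + len(rest)

    n = 400
    _, sorted_cost = quicksort_first_pivot(list(range(n)))
    _, random_cost = quicksort_first_pivot(random.sample(range(n), n))
    print(sorted_cost)  # 79800, i.e. ~n^2/2 comparisons
    print(random_cost)  # typically a few thousand, i.e. ~n log n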
~~~
raiyu
I think community colleges are great, and if you can get one-on-one
instruction anywhere, that is fantastic. The challenge is that bootcamps are
really designed for people who are changing careers, which means they have
already gone through college or are actively working. So it becomes an issue,
because they are reliant on income to survive and they aren't living with
their parents.
Basically it's unplanned, and it's much harder to be able to commit two years.
Certainly doable, but challenging when you are thinking of it from a consumer
perspective: spend two years working towards a career shift, or get it done in
6 months.
Maybe community colleges can do a better job of marketing themselves.
But ultimately I think the fact that computer science isn't a requirement in
all education is criminal. We study "Math" and "English" in school. "Computer
Science" is the equivalent of math 100 years ago, it needs to be a mandated
requirement.
~~~
superduperuser
"" Computer Science" is the equivalent of math 100 years ago..."
Would you mind expanding on that?
~~~
raiyu
You can't live in the modern world today without understanding math.
You also can't live in the modern world without language and the ability to
communicate.
To me math is a language. Different from our verbal languages, but it is still
a language nonetheless, and essential.
Programming to me is also a language. And it is as essential today as math was
originally. There was a long period of history where lower economic classes
were prevented from learning and educating themselves and the fact that we
have education that is subsidized by the government to various levels in all
major countries is something that we take for granted, but it wasn't a right
that our ancestors had.
The way schools force all kids to learn math, I feel is how schools today
should force all kids to learn programming and computer science.
------
kostyal
I'm a current student at Lambda School. It's a pretty stressful time at the
moment - I don't have any loyalty to Lambda, but the recent string of damaging
stories about the quality of teaching and average graduates is concerning.
It's true that Lambda is incredibly disorganised and the build weeks etc are
chaotic. It's equally true that they don't do a good enough job of ensuring we
have something to show for ourselves in our portfolios.
It's also true that their admission standards are seemingly incredibly lax.
About 40% of my cohort struggle to code at a fundamental level - I don't mean
that harshly; it's Lambda's fault.
With that said, I've really enjoyed my time at Lambda overall and it saddens
me to see it fail like this. The atmosphere and internal culture that they
cultivated is second to none and I have enjoyed my time there a lot.
As with many people at Lambda, I joined them at a difficult time of my life,
when I was suffering from pretty severe depression. I knew I loved coding but
barely spent any time doing it and struggled with impostor syndrome, etc.
While at Lambda I benefitted hugely from the daily structure and discipline,
and from having a community of people in the same position as me. I've made
some great friends, and met some very smart and talented people.
What pains me is the embarrassment of appearing like some clueless fool who
got caught up in some get-rich-quick scheme. I love programming, and I just
wanted a structured curriculum to train as a professional.
~~~
_ah
> What pains me is the embarrassment of appearing like some clueless fool who
> got caught up in some get-rich-quick scheme. I love programming, and I just
> wanted a structured curriculum to train as a professional.
Don't let this stop you. The world needs more good engineers, and if you
practice your craft you will always find a home. There are plenty of industry
professionals who now look a bit silly for their choice of company (Uber,
WeWork) but ultimately it's all just a job and if you have the raw skills you
can find a new gig.
------
caust1c
Having friends in the program, I can tell you it's run like a circus.
Expecting 10 people who don't have any experience programming to cooperate on
a project without any support or oversight is just asking for student
failures.
Of course they'll say that the students have support through their PMs, EMs,
or TLs (depending on the mood, they change the role title), but they're never
available and miss meetings constantly. Also, they've reorganized the
curriculum multiple times during the tenure of my friends, and they don't wait
till the next batch like a sane school would.
I feel really bad for the excellent teachers they brought on board. They ended
up with a lot more than they bargained for.
Half or more of the program is composed of crappy group projects.
The people who succeed after Lambda School do so in spite of the program, not
because of it.
~~~
sriram_sun
I agree with almost everything you say. However, in all the college projects I
was part of, there was no supervision whatsoever. We were all children
learning to be adults; some of us were already there before others. I'm
guessing Lambda students are more adults than kids. Bottom line: TA support is
golden. Supervision, not so much.
~~~
seem_2211
This is a golden example of where in-person beats online. It's so much easier
to weed out the people who will not show up when you do things in person (and
there are plenty of those in college). But even diligent people are more
likely to flake when things are online. There's simply less commitment.
------
hundt
> Whether or not this counts as “selling” strikes me as a meaningless semantic
> distinction: Either way, the school receives some money up front and an
> investor shoulders some of the risk of the ISA not paying out. And either
> way, Lambda School students don’t know that the school isn’t as incentive-
> aligned with them as the school’s marketing indicates.
It is certainly not meaningless! Selling an ISA means that Lambda no longer
has any financial interest in its outcome. Borrowing against an ISA is
completely different; if the ISA doesn't pay out then Lambda goes bankrupt,
which is precisely the incentive alignment they claim to have.
~~~
raiyu
It is selling, because the loan is backed by the ISA, which means that if
Lambda cannot repay the loan, the lending company then owns the ISA and can
use any sort of aggressive tactics to get the student to repay it.
Just like if you take out a mortgage on a house and fail to pay the loan back,
then the bank owns the house.
In this case since the ISA is used as collateral, the company that originated
the loan now owns a lien on the ISA, effectively giving it ownership.
Similar to when you lease a car, there is a company that provides the finances
for the lease, and has a lien on the car, which means you do not own it.
Otherwise you could lease a car for $250/mo, then sell it the next day for
$30k, but you can't because there is a lien on the title.
So in this case, while the loan is outstanding, the originating loan company
effectively owns the asset used for collateral. Ownership means you have 100%
control over the asset, and in this case, Lambda has given away 100% control
over the ISA.
It also means that they are not aligned anymore: since they have received
financial compensation for the ISA up front, they can default on their
repayment of the loan, and it doesn't matter, because the collateral isn't
shares in their company but just the ISA itself.
~~~
hundt
You seem to think that this "loan" that the investor makes to Lambda School
only has to be paid back when the ISAs pay out, and if they never pay out then
Lambda School's debt is just forgiven. That would indeed be similar to just
selling the ISAs to the investors, but I don't think that's what is described
in the article (as Lambda School's current practice). Rather the article says
that Lambda School gets a "loan that is secured by students' ISAs" which
implies that Lambda School has to pay it back with or without the income from
the ISAs.
~~~
raiyu
"Secured by" is the part I focused on. When a loan is secured by something,
the underlying asset is what the loan-originating company retains after you
fail to pay the loan.
So if I take out a mortgage and fail to pay it, the bank gets the house; I am
not still obligated to pay the remaining loan amount. Sure, there are a few
other items that occur in that process, but ultimately the loan is forgiven,
of course with some credit penalties and limits on the future ability to take
out loans. But I am not responsible for continuing to repay the loan.
If the loan is secured by the ISA, the same thing applies here. Sure, there
could be other stipulations, and without reviewing the contracts there is no
way to know, but it stands to reason that the more likely situation is that
they get the ISA and can then use aggressive tactics to go after the students
for collections, such as garnishing wages, and also potentially charging them
interest for failure to repay. Though the original ISA, I believe, has no
interest, that's not to say that late penalties or other fees can't be added
in later.
~~~
hundt
Ah, I think you have a misunderstanding about what collateral is. Collateral
is not generally a limitation on the lender's ability to collect the debt. In
some specific cases (like certain home loans in certain states) the law adds
this limitation but generally collateral is just something the lender can take
in the event of default.
To bring up the car loan example you mentioned earlier, if you decide after a
year that you don't want your car anymore you can't just drive it to your bank
and drop it off instead of paying off the rest of the loan. If you stop paying
it they can repossess it but you will still owe whatever is left on your loan
balance after they auction it.
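To put toy numbers on that distinction (all figures and logic invented for
illustration, not legal advice):

    def borrower_still_owes(loan_balance, collateral_value, recourse=True):
        # After default, the lender seizes and auctions the collateral.
        # With a recourse loan the borrower still owes any shortfall;
        # with a non-recourse loan the debt ends with the collateral.
        shortfall = max(0.0, loan_balance - collateral_value)
        return shortfall if recourse else 0.0

    # Car example: $20k still owed, the car auctions for $14k.
    print(borrower_still_owes(20_000, 14_000, recourse=True))   # 6000.0
    print(borrower_still_owes(20_000, 14_000, recourse=False))  # 0.0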
------
akanet
Hi, I'm the author of the piece, perhaps better known here as founder of
CoderPad. Feel free to ask me anything but things are a bit crazy right now!
~~~
intherdfield
I don't think you mentioned that the student doesn't have to pay back Lambda
after 60 months of deferred payments. That seems important. From the Lambda
site:
"The income share agreement has no interest. It's a flat percentage that goes
away once you've reached the $30k payment cap, you've made 24 payments, or
after 60 months of deferred payments (even if you haven't paid us anything)."
But you wrote in the article:
"Students with no safety nets experience real financial pain from the nine-
month hiatus from work, in addition to the looming dread of possibly having to
pay Lambda $30K one day."
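To put rough numbers on those published terms (the salary here is
hypothetical, and deferred months are simplified to just producing no
payment):

    def isa_paid(monthly_gross, months_employed, share=0.17,
                 cap=30_000, max_payments=24):
        # Toy model of the published ISA terms: pay `share` of gross
        # income until hitting the cap or 24 payments, whichever
        # comes first. Illustrative only, not the actual contract.
        total, payments = 0.0, 0
        for _ in range(months_employed):
            if payments >= max_payments or total >= cap:
                break
            total += min(share * monthly_gross, cap - total)
            payments += 1
        return total

    # A graduate at $60k/year ($5k/month) pays $850/month;
    # 24 payments x $850 = $20,400, well under the $30k cap.
    print(isa_paid(5_000, 36))  # 20400.0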
~~~
akanet
Yeah, got cut for brevity, but students definitely experience serious anxiety
about even the five year horizon.
~~~
intherdfield
That seems really disingenuous. Can't you update the article with this
important detail?
~~~
akanet
It's not disingenuous; I think an earlier draft had it. This came down from
4000 words. I'll try to ask my editor to add this as a correction but things
are pretty hectic. It's a fairly minor omission in my opinion, but I
understand disagreeing.
~~~
tomhoward
It's hardly a "fairly minor omission" in a country where the conventional
tertiary education saddles the average student with a debt that is often far
bigger than $30k, doesn't expire, and cannot be escaped even through
bankruptcy, meaning that even retirement benefits can be garnished above a
meagre $750/month.
------
soneca
I learned to code from scratch using freeCodeCamp.org, tutorials and official
documentation. Got a good job after 8 months studying full-time. Almost 3
years now and everything is good (more details on my experience here:
[https://rodrigohgpontes.github.io](https://rodrigohgpontes.github.io)).
As a totally outsider (not even from the US), Lambda School seemed too good to
be true, in a good way, and I said so here in HN. 9 months of serious
commitment, no upfront payment, ISA, all of it seemed good to me.
These days, I don't think the same way. Lambda School and its founder seem
more worried about being a billion-dollar company than about doing the right
thing for their students. Personally, I think that's strategically stupid and
counterproductive: they won't become a billion-dollar company precisely
because they don't care about their students. But it looks like the direction
they took. This mistake is not inherent to VC-backed companies or Silicon
Valley, so I think the ones to blame are Lambda's leaders, not the system.
So, I still recommend freeCodeCamp.org as the best thing that exists for
people who want to learn to code and get a job. I wish one of these
effective-philanthropy multimillionaires would give them generous money to
pursue their mission.
It changed my life.
------
pembrook
While I find Lambda’s incessant twitter evangelizing as annoying as the next
guy, I’m not sure any article I’ve seen is painting a realistic picture about
Lambda.
Media outlets have incentives to either paint you as the second coming of
Christ or as Satan. It appears Lambda, for a while, actually succeeded at
convincing journalists they were the former.
After a while, people get bored with that though. The incentives that drive
clicks flip. Suddenly Lambda is now Satan. Burn it down! Downvote all
sympathizers!
Here’s the reality: all models for education can work for certain people in
certain instances. Lambda is definitely the best choice for some people. But
no single company is going to solve something like “education” or “healthcare”
because they are political institutions tied to the power dynamics that
determine how society is arranged. You cannot brute force this without gaining
influence over government itself.
This is not as simple as disrupting where people buy their shampoo or where
they see ads.
~~~
raiyu
There are always two sides to every story, but when you have outright
fraudulent claims, I don't think you can say that the article is simply
painting the school as "Satan".
If you stated that you have an 86% placement record and in reality it is 50%,
that is a pretty large discrepancy. If the original 86% placement was from the
first 70-ish students and now you are over 2500 students, that seems a bit
fraudulent.
If placement rates aren't critical to you getting students, then you can say
it is 50% publicly and see if that affects your enrollment numbers or not.
Otherwise, it would stand to reason that stating a higher placement rate gets
you more students.
Also, this isn't run as a non-profit organization; it's a for-profit
enterprise. So they have a financial incentive to get more students, because
that equates to more value for them.
~~~
pembrook
Cherry picking a strong cohort and using it to create a narrative is the same
thing as cherry picking a weak cohort and doing the opposite.
This is my point. Journalists are picking a narrative first, then seeking out
facts to justify that position.
When they thought lambda was going to fix education they were more than happy
to report the 86% number without any research. Now that lambda is “evil” they
look for the lowest number they can find.
I’m sure the truth is somewhere in the middle.
~~~
raiyu
The other wasn't a weak cohort; it was the most recent data.
Also, if you were an institution interested in disclosing information fairly,
you would simply list all cohorts and let the consumer decide. But they aren't
doing that and instead claim a single number.
By the way, having had recent experience with a number of bootcamps through
friends, I would say the 50% number is anecdotally accurate. That's also not
taking into account the dropout rate; if you measured from starting the
bootcamp to placement, the number would be even lower.
If you have 465,000 customers and you want to round that up to 500k - ok, not
exactly true, but whether you have 465k or 500k isn't going to change my
decision about using your product.
Telling me you place 80% of your students while actually placing 50% is a big
deal. 80% means I have a 4-out-of-5 chance of being successful; 50% means I
have a 1-out-of-2 chance. Those odds are very different, and the higher number
is certainly part of the marketing push to get people to sign up.
------
lquist
Doesn't surprise me. Culture starts at the top. Austen has repeatedly proven
to be a liar (this article has a few examples and you can find others in
Verge's reporting) and that culture has been normalized at Lambda.
------
brenden2
From seeing this guy's (Allred's) tweets pop up on Twitter a few times, I
always got the feeling that he was a scammer or con artist. This article
provides a lot of evidence that seems to confirm my hunch.
I've seen people have success in startups using similar tactics throughout my
career, and one realization I've had about this is that sometimes perception
matters more than the actual numbers. It's especially true when it comes to
investing and VC fueled businesses where success tends to follow the funding
(i.e., if you get enough money you can buy your way to success).
Eventually, however, everything comes out in the wash. At some point lying
about the numbers won't work anymore, and maybe for Allred this is an example
of that. As a founder, if you don't eventually deliver on what you've
promised, it will all unwind and you'll be left with nothing.
~~~
seem_2211
Taking a more benign view, I don't know if Austen is a scammer, but he seems
like a smart guy who's a bit in over his head, with a smart team who are also
in over their heads. The kind of students they've targeted need more help, not
less, which makes this a bit of a messy situation.
------
danso
The author also posted a 50-minute audio clip of his interview with Austen
Allred:
[https://twitter.com/fulligin/status/1230162120701964289](https://twitter.com/fulligin/status/1230162120701964289)
~~~
Fede_V
This is exceptionally good. Fantastic journalism.
------
hundt
Interestingly, Lambda School has just announced a new "ISA Financing
Blueprint" [0] and "Better Data Transparency" [1] which address some of the
concerns from the article.
[0] [https://lambdaschool.com/the-commons/announcing-our-new-
isa-...](https://lambdaschool.com/the-commons/announcing-our-new-isa-
financing-blueprint-and-100m-in-new-financing)
[1] [https://lambdaschool.com/the-commons/building-better-data-
tr...](https://lambdaschool.com/the-commons/building-better-data-transparency-
at-lambda-school)
------
wbharding
Mildly interesting how this story has gotten 130 points in two hours but is
still placed beneath numerous other HN front-page stories with fewer points in
more time. Not least since the incentives of YC would be to want this story
dead. In the 5 mins it took me to log in on mobile it dropped from 15 to 25.
~~~
dang
The submission hit some software penalties, like HN's flamewar detector.
Moderators didn't touch it or even see it until a few minutes ago (I'm getting
a late start this "morning"), when I saw it and turned off that penalty. This
placed the article back on the HN front page.
We moderate HN less, not more, when YC or YC-funded startups are in a story.
That's literally the first principle of HN moderation, in the sense that it's
the first thing PG told me when he was training me to do this in 2012. I
didn't even have time to grab a chair before he blurted it out.
It's natural for people to question this, so such questions will always come
up. We're scrupulous about following this rule for two reasons. The first is
that we need to be able to answer the questions in good conscience; the second
is that moderating any other way would be dumb. The community's trust is the
most valuable asset HN has. In fact, if you think about it, it's the only
asset HN has.
[https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...](https://hn.algolia.com/?dateRange=all&page=0&prefix=false&query=by%3Adang%20moderate%20less%20not%20more%20yc&sort=byDate&type=comment)
------
DantesKite
It’s interesting how learning to code is so challenging. Especially because
it’s something humans make. Over the past few years, coders have been
abstracting the complexity away, but man there’s a lot of it.
On a more meta level, it feels like we’ve barely touched the margins of how
humans learn best. That’s always struck me as odd.
Like nobody has quite cracked the nut for how to do it. Not with programming,
not with language, not with cooking. It’s always so much effort. As if the
lessons people have learned are forgotten as soon as a few years pass.
We know a little bit. Spaced repetition helps. Not doing anything after
studying intensely helps (as in literally just doing nothing after studying
improves retention).
Going for long walks helps. So does transforming concepts into something you
can see (I suspect because we have a lot of “machinery” for visual
computations).
Engaging more of your muscles during an activity (writing as opposed to
typing) helps too.
Engaging extra senses (like smell) seems to help.
But that’s about it. There’s no common modality, no library of how humans
learn, retain, and transfer knowledge.
It’s been like what? 200,000 years since we’ve been on this planet.
We’ve barely touched the surface of what we’re capable of doing because we
keep forgetting.
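(Of those, spaced repetition is about the only one with a well-known
mechanical form. A simplified sketch of an SM-2-style interval rule, the
family that Anki-like tools descend from; this is a toy simplification, not
any specific app's algorithm:)

    def next_interval(prev_interval_days, ease, quality):
        # quality: 0-5 self-rating of recall for a flashcard.
        # Failing a card resets the interval; passing stretches it
        # by an 'ease' multiplier that adapts to past performance.
        if quality < 3:                       # forgot: start over
            return 1, max(1.3, ease - 0.2)
        ease = max(1.3, ease + 0.1 - (5 - quality) * 0.08)
        return max(1, round(prev_interval_days * ease)), ease

    interval, ease = 1, 2.5
    for q in [5, 4, 5, 3, 5]:                 # a run of review sessions
        interval, ease = next_interval(interval, ease, q)
        print(interval, round(ease, 2))       # intervals stretch out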
~~~
jacques_chester
> _On a more meta level, it feels like we’ve barely touched the margins of how
> humans learn best. That’s always struck me as odd._
It's very well studied. I'd start by reading a cognitive science textbook and
go from there. I have _Cognition_ by Reisberg[0], it was well-written and
well-structured.
Alternatively, an entertaining and informative popular account is _Make It
Stick_ by Brown, Roediger and McDaniel[1].
I do agree that the research findings are greatly underapplied, though.
[0] [https://www.amazon.com/Cognition-Exploring-Science-Mind-
Seve...](https://www.amazon.com/Cognition-Exploring-Science-Mind-
Seventh/dp/0393624137/)
[1] [https://www.amazon.com/Make-Stick-Science-Successful-
Learnin...](https://www.amazon.com/Make-Stick-Science-Successful-
Learning/dp/0674729013)
~~~
DantesKite
I enjoyed the second book recommendation. Looks promising.
Thank you.
------
codegeek
I could be wrong, but isn't Lambda School a perfect example of destroying a
noble idea due to the venture-capital-ish returns expected from the company?
If there wasn't so much pressure to grow that fast, perhaps they would have
done a better job focusing on the quality and outcomes instead of growth at
any cost?
~~~
caust1c
Reading history about the founder, I think he had his eyes set on massive
growth from the outset over the actual experience.
He got starry-eyed working in growth at LendUp and wanted to build his own
rocket ship. I don't think he ever put the students above running a trendy
fast-growing startup.
------
kregasaurusrex
Since the original HN post[0] is about 2.5 years old now, one would expect
there to be more data released by LS regarding job placement/salaries before a
potential student signs a $30,000 ISA. I've been a fan of the idea of
up-skilling at a bootcamp for those who either don't want to spend 4 years
getting a CS degree or are unable to finance its associated costs, which can
be in excess of $75,000. In an increasingly tight labor market, there are
fewer jobs available in the 'middle', where companies seem to want either an
experienced SME in a specific domain or less-experienced programmers to do
testing rather than development.
I think the bundling and selling of student CDOs works for a company in its
early stages, since the costs of recovering unpaid tuition can range from high
to a total loss. Some of the anecdotes make it seem like the company doesn't
receive enough feedback from employers to strengthen the parts of their
curriculum where interviewees fail during the interview process. Creating a
complete curriculum takes more than bundling together Leetcode and Hackerrank
problems, and the actual teaching reflects that bootcamps and a CS education
aren't in direct parity with one another. [1]
Regarding the finances, student loan debt through an established institution
can't be removed through filing for bankruptcy. This allows access to federal
loans and underwriters that give better rates and agreements for all the
business parties involved. I've been denied refinancing of my loans by
multiple companies for a combination of those factors, including not
graduating and having too low an income. The business seems like a good idea,
but it is too fixated on higher short-term valuations to attract VC capital
rather than fixing its underlying structural problems.
[0]
[https://news.ycombinator.com/item?id=15011033](https://news.ycombinator.com/item?id=15011033)
[1] [https://hackernoon.com/bootcamps-vs-
college-11dd76a4d127](https://hackernoon.com/bootcamps-vs-
college-11dd76a4d127)
------
MivLives
Lambda grad with a job, who also worked as a PM/TL.
Overall I think they're trying.
The problem with the PM/TL program is that the people who are normally the
best at it tend to be the best at finding jobs. The TLs that stick around
forever tend to be the ones who aren't the best or the worst (the worst are
normally replaced when someone complains). Instead it's the okay people who
are just cruising.
One thing that seems to be overlooked about Lambda is the fact that they are
remote. An applicant from the middle of nowhere who lives hours from the
nearest city has as much chance of getting in as anyone. It can be harder to
find a job there, and they won't relocate. Remote work is much harder to get
as a junior dev.
Overall, I wouldn't be in the position I'm currently in without them. Maybe I
just made it in before the line, maybe they're just undergoing growing pains
as they expand to a profitable size.
------
mikekij
I don't have an informed opinion on Lambda School, but it seems as though most
of the concerns raised in this article are true of universities as well, and
in some cases, worse in universities:
Lambda makes money through some financial engineering: Ivy League schools are
essentially hedge funds with their multibillion-dollar endowments.
Lambda's curriculum is lacking: most CS students are taught theory, but learn
very few of the skills they'll actually need on the job (e.g. application
deployment, source code management, etc.).
Lambda inflates its job placement stats: this is never defensible, but it is
also highly prevalent in both non-profit and for-profit higher education
institutions.
I commend Austen for at least attempting to better align students' and
educators' financial incentives.
~~~
sjc33
Lambda School didn't invent the ISA.
------
xwowsersx
I co-started LS before it was in YC. Here's the original course I taught
[https://www.youtube.com/watch?v=XnUp9BNCQZM](https://www.youtube.com/watch?v=XnUp9BNCQZM)
Weird seeing stories about it now.
~~~
mattcdrake
Are the other videos in this series publicly available? You are great at
explaining things.
~~~
xwowsersx
Thanks! I'm not sure where the other videos are (I don't see them on my
Youtube). I'll see if I can dig them up.
------
fronofro
There are inherent limitations to edu that go unacknowledged.
- There are a relatively fixed number of high-skill jobs.
- More people with the same skills/portfolio reduces the value of that
skill/portfolio.
- While people can learn new skills, people have different learning rates, and
integrating that knowledge deeply often takes much longer than
teachers/bootcamps want to believe.
- Most edu info exists for free online; what schools/bootcamps uniquely
provide is feedback. Quality feedback is HARD to scale.
- How much does pre-application screening just find people who don't really
need that much help, vs. trying to help every applicant and getting
unmotivated people in your bootcamp? I have seen very few examples of a
bootcamp fundamentally changing someone's motivation/work ethic.
ISAs will be the big winner in the ed tech space, and providers of education
will likely come and go unless someone cracks all the issues above. Perhaps
the current distribution of students across universities (having plurality) is
actually better suited to serving this function than a typical tech-world
power-law distribution where one or two companies provide 80% of the edu.
------
_hardwaregeek
The more I think about bootcamps, the more they appear to be a rather
insulting business proposition. Did they genuinely believe that one can take
someone off the street, albeit someone relatively intelligent and motivated,
and teach them enough to be a professional programmer in 9 months? Imagine
that we replaced programmer with mechanical engineer. Or with marketing
executive. Would that sound believable?
What is it about programming that makes people believe that they can learn it
in a hilariously short amount of time? Sure, we programmers may be harboring
some imposter syndrome and secretly believe that programming is super easy.
Sure, programming doesn't require lots of math. But it's still hard! It's
still a craft that requires problem solving ability, lots of semi-arcane
knowledge and a detail oriented mind.
~~~
sean2
I don't think we could pump out well-rounded developers like 4 years of
undergrad can, but I don't see why we couldn't pump out SQL technicians,
front-end developers, or otherwise specialized workers who could alleviate the
supposed shortage of developers.
I also helped a person go from zero knowledge of programming to pumping out
Android games after a month of self-study, so I know that smart, motivated
people can get pretty far pretty fast. In regards to your point, I don't think
this person knows UML or ever learned big-O notation, but they were able to
gain the skills required to do what they wanted, plus they had some
pro-programmer friends and Stack Overflow to fill in the "semi-arcane
knowledge" when required.
~~~
_hardwaregeek
I think we can pump out good _apprentices_. After 9 months of study, the
person could definitely have enough skills to write semi-valid code and work
on some small projects with supervision from a developer. However, I would not
call this person a developer after 9 months. A developer should be able to
look at a codebase, learn it, learn the relevant technologies and start
contributing. I doubt a bootcamp graduate will be able to do that. Or worse,
they'll do that and wreak havoc on the codebase.
> front-end developers
I suspect this is part of the issue. Plenty of people believe that front-end
is easier or lesser than back-end, and therefore can be learned quickly. Lots
and lots of terrible front-end code has taught me that this is not true. Good
front-end developers are really hard to find and subpar ones can lead to
horrible user interfaces and direct impact to the bottom line. Even if we're
talking about the most minimal of front-end stacks, i.e. HTML, CSS and JS
(which I'm not even sure people hire for anyway), there are a lot of subtle
issues with accessibility, responsive UI, writing halfway-decent JS, etc. If
we add on the various libraries (a sign of a good developer is also knowing
when to use these libs and when to avoid them), then the amount of knowledge
required is far, far more than 9 months can provide.
------
taytus
Austen has totally disappeared from twitter. He used to tweet all the time
about their _AMAZING_ success rate.
~~~
prawn
He was also quick in the past to post about LS on HN but no comment in this
thread at all. Might be responding in blog form and outlining a few changes
they're making to address the concerns?
------
rajacombinator
There are plenty of red flags in the article but the part about Lambda selling
the ISAs removing their incentive is BS - their long term incentives are
clearly still aligned because they won’t be able to continue selling ISAs if
they don’t deliver. Selling the ISAs presumably just helps with cash flow.
That said it makes one wonder why a company that’s raised 9 figures needs to
worry about cash flow.
------
abbadadda
This anecdote is particularly damning for Lambda School, like a hedge fund
choosing not to update their investment return numbers year after year because
they were so great in year 1:
> So where does that 86 percent figure come from? Lambda has reported
> graduate-outcome statistics at the Council on Integrity in Results Reporting
> (CIRR), a voluntary trade organization of coding boot camps whose purpose is
> to ensure that participating schools publish truthful information about
> student outcomes. Allred has often used this report to defend his company
> online. But where other boot camps have multiple reports spanning many
> student cohorts, Lambda has only reported statistics for its first 71
graduates — 86 percent of whom, the school claims, found jobs. Sheree
> Speakman, the CEO of CIRR, told me that Lambda has not undergone the
> standard independent auditing for the sole report it has submitted, and that
> her communications to Lambda School regarding further reporting and auditing
> have gone unanswered.
------
hprotagonist
relatedly, it's so nice to see zed shaw tilting at a windmill that deserves it
for once:
[https://twitter.com/lzsthw/status/1212284566431576069?lang=e...](https://twitter.com/lzsthw/status/1212284566431576069?lang=en)
~~~
azangru
"If you are contemplating joining a coding bootcamp in 2020...", his tweet
starts.
I wonder whether the quality of education at coding bootcamps has gone
downhill. From what I know of the Web Development Immersive program at General
Assembly in 2014, or the software engineer program at Hack Reactor in 2015,
they both were very decent.
------
manfredo
Lambda School struck me as odd due to its very long 18-month duration. Most
successful bootcamps I've seen have been focused 2-4 month programs. They also
try to get students who have already taught themselves some coding (at least
being able to code up a hangman game in a terminal), and they focus on
teaching effective abstraction and how to use git. Basically, they don't seek
to take people who don't know how to code and turn out engineers more than a
year later; they take people who know how to code and refine their skills to
the point that they are employable.
~~~
MivLives
18 months is the part-time track; 9 is the full-time one. It used to be six,
but they raised it because people would expect a job on graduation, and nine
is more realistic for that.
------
MarkMc
All the haters seem to be piling onto Lambda and its CEO, but looking at
actual reviews by students it is clear that most students are very happy with
Lambda: [https://www.switchup.org/bootcamps/lambda-
school](https://www.switchup.org/bootcamps/lambda-school)
And the problems mentioned in the article are straightforward to fix:
(a) Be honest and transparent about the job placement rate for each batch
(b) Don't sell the ISA or borrow against it
(c) Improve the quality of teaching and accept a lower growth rate
~~~
Step888
Take a look at the other coding bootcamps on that site. They all seem to have
very positive reviews.
The problem is that switchup uses the affiliate links of the bootcamps, so it
has a financial incentive to sell the bootcamps it features to its viewers.
That's how it makes its money.
Now Amazon does the same thing, you'll say. And you'd be right, but with
Amazon, it has a reputation to maintain. If Amazon reviews are unreliable,
people will stop shopping at Amazon.
With Switchup, it's different. Most people will only buy one coding bootcamp
in their lifetime, so Switchup doesn't have the same incentive to make its
reviews super accurate. It mostly has the incentive to sell as many coding
bootcamps as it can.
------
nkrisc
About these ISAs: do they require any kind of good faith effort to find a tech
job after completing the boot camp? If someone was just interested in tech,
but worked in an unrelated field with no intention to switch, how would Lambda
protect themselves? Obviously they could probably sue if there was bad faith
on the student's part, but do these agreements address this scenario? Do they
focus on screening students to avoid this?
~~~
hundt
My understanding is that Lambda school is very focused on teaching you what
you need to get a job, so if you were just "interested in tech" you would
probably be better served by Udacity, other free or nearly-free online
courses, or just reading books.
~~~
nkrisc
Right, there are certainly better options. I'm not necessarily interested from
a practical standpoint, more of a theoretical one, being relatively unfamiliar
with how these agreements work beyond a one-sentence description.
~~~
hundt
You can read what Lambda School describes as the "template" for their ISA
contracts:
[https://leif.org/api/products/5b5b8bd0e59b743f9a086ed9/pdf](https://leif.org/api/products/5b5b8bd0e59b743f9a086ed9/pdf)
It says that you agree that you "are entering into this Agreement in good
faith and with the intention to pay us" and will "make reasonable and good
faith efforts to seek employment" as long as you are not paying them.
I don't see any specific provisions describing how that would be enforced and
I bet in practice it is not a major issue for the reason I mentioned above:
there isn't much reason to go through with Lambda School if you don't actually
want a job in tech, so it probably doesn't happen much.
------
AVTizzle
On the authors Twitter, he comes across as having an axe to grind:
"lmao I woke my boy up" [1]
Sounds like he's keeping score of something, which isn't a quality I
necessarily want from my "journalists".
[1]
[https://twitter.com/fulligin/status/1230251533951889409](https://twitter.com/fulligin/status/1230251533951889409)
------
Dansvidania
Is this the same, or does it stem from, Lambda School that started with a
kickstarter in 2016?
------
nimbius
>Lambda School is free, but with an asterisk: To attend, you sign a contract
that says that if you get a tech job paying $50K or more, you have to pay 17
percent of your pre-tax income to Lambda School for two years, or until you
pay back $30K, whichever comes first.
$30k? Seriously? My trade tech certification was only $2,900 for _two years_.
Is Lambda School a 4-year program or a college of some sort? Is room and board
of some kind included? I mean, I get that STEM pays a lot... but the cost here
seems a little steep.
There are only 4 STEM-related courses on the site. My trade-tech school had
nearly 20 by the time I graduated.
~~~
conanbatt
Was it an 8 hour a day 5 days a week program for two years?
------
ChicagoDave
The problem with Lambda School is the model and vision suggesting that a
general college education isn't worthwhile; that college should focus on a
marketable skill directly leading to some job. That may be optimal for job
placement, but it leaves out the social aspects, the critical-thinking
aspects, and the safe space to have an open mind and absorb and share ideas.
I would argue those other aspects of higher education are far more critical
than the bootcamp model and vision.
And eventually it boils down to a waste of time for some subset of students,
which is not helpful to our higher-education interests.
------
sjc33
I found his constant Twitter boasting distasteful and cringeworthy. It always
came off like an infomercial.
Coding bootcamps are not all bad, though. It's just that if you are going to
do one, then 1) you really need a solid financial cushion (either parental or
your own savings) and 2) you should really do a top-rated, in-person one. I
can't imagine doing this over Zoom for 9 months alone in my apartment. The
value of the social support you get from instructors and from being around
other students in the same boat as you can't be overstated.
------
chrisyeh
Lambda's marketing is deceptive and should rightly be stopped. But what is so
bad about a bootcamp with no up front cost that gets 50% of its graduates a
job? [https://chrisyeh.com/2020/02/the-cost-of-
cynicism.html](https://chrisyeh.com/2020/02/the-cost-of-cynicism.html)
------
737min
Found this blog post from last year comparing Lambda to App Academy; it
confirms many of the details and adds some more. No involvement with them on
my part.
[https://blog.appacademy.io/app-academy-versus-lambda-
school-...](https://blog.appacademy.io/app-academy-versus-lambda-school-which-
one-is-better/)
------
throaway1990
Fascinating story. I think there is value in the service provided by Lambda
School, but questionable revenue tactics like selling ISAs or using ISAs to
secure loans mean students can't really trust them with placements.
------
737min
Good to see this finally get called out. Basically Lambda is using CDOs &
subprime mortgages as a revenue source. Even if it helps some of the students
some of the time, we know how this show ends.
------
cm2012
I mean, it's still a vastly, vastly better deal than college or other boot
camps. It has no up-front monetary investment, and you only pay for it if it
works for you.
~~~
dehrmann
> it's still a vastly, vastly better deal than college or other boot camps
I'd like to see ten-year outcomes before claiming this. That said, it's also
hard to control for different student backgrounds.
~~~
chasing
I would be more interested in the ten-year outcomes of something like the
Lambda School versus self-teaching or using cheap online resources and
entering the job market. Either way you're going to have to start with a very
junior and entry-level position in order to crack the job market. But at least
you're not on the hook for a huge chunk of your paycheck.
------
Grustaf
As a working developer you will spend a sizeable portion of your time learning
new things, so if you are not good at teaching yourself programming, maybe
it's not a suitable career choice? Especially with amazing MOOCs like
Stanford's Swift course available.
At least in the context of something like Lambda School, which does not teach
computer science, as I understand it.
------
heedlessly3
There's no point in doing coding bootcamps. I know plenty of individuals,
non-CS majors, not even STEM, who switched to software development after
self-study and building up a portfolio.
If you decide you want an accredited university degree, then you can get FAFSA
to fund community college, then transfer to an online university such as
Florida
~~~
conanbatt
Strongly disagree with this. I know 50 people who think they can learn to do
data work or something with a Udemy class and end up doing nothing.
I think the negative picture painted by the author masks that the company is
placing 50% of graduates, most of whom _paid nothing_ for the education they
received. Lambda is not high-quality education, but it's super cheap and
higher quality than just reading a Knuth book, or Skiena's algo book, which
has no completion rate.
~~~
heedlessly3
Sounds like you know unsuccessful people. I know plenty of developers who
became front-end or DevOps engineers through their own studies. It's not too
hard to learn SQL or data visualization. The majority of questions can be
answered via Stack Overflow or Slack communities. You don't need to know
everything to land an entry-level job; you just need to prove you know how to
problem-solve.
Either way, you are not becoming a true machine learning researcher via
self-study or a bootcamp; you will need at least a master's in CS/Math/Stats
to land those roles. A bootcamp is neither the lowest-cost path to a software
dev career nor the career-maximizing choice.
------
BlueTemplar
This seems more like job training than education - something that should be
directly done by the companies that end up recruiting these people instead.
------
surferbayarea
The skill of knowing how to use React in a particular way to create simple
views is something that will be redundant in a few years. These people will
unfortunately be out of jobs. The mundane work of converting Sketch files to
React views will be automated soon.
------
jariel
These courses should be free.
If government agencies really had their acts together, then these courses, as
well as many others, would be free as part of 'extended learning' programs.
Given the crazy amount of grant money being thrown around by so many
governments, and the need for certain disciplines, it's odd that this doesn't
happen.
------
neilk
I had a sinking feeling when the inventor of the CDO appeared in this article.
The housing bubble occurred, in part, because there was optimism that
acquiring a certain tangible asset - a house - was a sure path to prosperity
for the buyer. And the buyers of mortgage-backed securities had decades of
data showing that these were solid investments.
What about intangible assets like knowing how to code? Is this what Lambda
School is really doing?
And like the housing bubble, once the originator can sell the debt, they have
some incentives to inflate expectations on both sides.
Perhaps Lambda School’s ISA-selling isn’t vulnerable to this flaw. Maybe the
ISA-buyers are doing due diligence. But you would have thought the same was
true in 2008 too.
Like everyone with a Twitter account I’ve interacted with Austen. And I had a
mildly positive impression, or at least I thought he deserved more credit. Not
sure about that any more. I would like to see his response.
------
ucha
I'm wondering why two hit pieces came out at the same time [1]. Is it just a
coincidence?
[https://www.theverge.com/2020/2/11/21131848/lambda-school-
co...](https://www.theverge.com/2020/2/11/21131848/lambda-school-coding-
bootcamp-isa-tuition-cost-free)
~~~
azangru
I've seen discontent with Lambda brewing on Twitter for quite some time:
[https://twitter.com/KeziyahL/status/1155154616281178114](https://twitter.com/KeziyahL/status/1155154616281178114)
~~~
SamReidHughes
Any program teaching CS or software with relatively open admissions will have
its discontented students, because there are simply people who don't have the
cognitive ability to handle it, and their inability to do the homework is
always the teacher's fault.
How Europe is totally owning our in-flight electronics policy, again - Libertatea
http://www.washingtonpost.com/blogs/the-switch/wp/2013/11/14/how-europe-is-totally-owning-our-in-flight-electronics-policy-again/
======
selectodude
Wait, how is a flat rate internet service using WiFi worse than allowing
airlines to implement some sort of LTE antenna booster on every plane so I
have the opportunity to have no idea what I'm about to pay to use my phone?
What a horrendous article.
~~~
JonFish85
"Americans are feeling pretty smug after winning the right to use portable
electronics on airplanes during takeoff and landing. In fact, we even beat
Europe to the new rules by several weeks."
WTF? That's how you start an article? I haven't seen anyone feeling smug at
all about this ruling. Talk about trying to manufacture an issue!
~~~
mhurron
It's very important now to frame everything as a 'Us v. Them'. No one cares
about anything unless they are on the winning team.
------
georgecmu
I would be perfectly happy with in-flight phone calls being expressly
prohibited without any technical justification. Can you imagine being forced
to spend 6 or more hours next to someone with a bad case of glossolalia?
~~~
nlh
I have a feeling that enough people agree with you on this that regardless of
the legal / regulatory situation, a social framework is going to emerge that
pretty strongly discourages talking on the phone on a plane.
Even today, if you talk for too long or too loudly on an Amtrak train or a
public bus, you'll get some pretty strong death stares from other passengers.
Many will come right out and tell you to shut up.
If the rules fail us, peer pressure will step up :)
------
wjoe
I'm not really sure how 3G/LTE is better than WiFi, at least in technology
terms.
I've had WiFi on one flight in Europe (London to Oslo with Norwegian
Airlines), it was free to use for everyone on the flight. It was very slow and
unreliable though, and I spent half of the hour long flight just trying to get
things to load, unsure if certain ports/sites were blocked or if it was just
being slow.
That said, this problem presumably comes from the connection between the WiFi
access point on the plane and the satellites. I don't really see how using
3G/LTE on the plane will be any better if it's still relying on the
(presumably small, low-powered) equipment on the plane and a satellite
connection.
So I'm not convinced that this is an improvement over in-flight WiFi, beyond
being a way for carriers to extract more roaming data charges from us. That
said, we've also got better laws on roaming data charges within Europe coming
in over the next few years.
~~~
Uberphallus
Nice-London flight here; it was slow and unreliable too. Instant messaging
worked noticeably better than browsing websites, but upload speed was rather
slow (1 min to share a picture in WhatsApp).
Still, it's an amazing service for free, the kind you'd expect to pay extra
for.
------
smileysteve
WiFi on flights uses less battery, is likely faster, is a better fit for the
small environment of a flight, and, because it bypasses the carrier routing
(and business politics), is possibly more secure as well.
I'm perfectly okay with having WiFi instead of 4G.
------
kjf
So I guess leet speak has evolved to the point where traditional publications
like the Washington Post are using it sincerely in their headlines.
------
Theodores
What would be fun is if planes formed their own 'Iridium'-style network,
communicating with the planes in front of and behind them on their flight
corridor, with airports as base stations. Only data need be carried (no
voice), as who really wants people yapping away on a plane? The transmitter on
the plane could assume a plane roughly 100 miles or so in front; it would be
roughly pointing in the right direction, so the inverse-square law would apply
over that distance, rather than the satellite case, where the 'bird' is 20,000
or so miles away at some angle above the equator and has to be pointed at with
some complicated servo arrangement. The TTL (latency) should be improved with
an ad-hoc above-the-clouds network too.
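Back-of-the-envelope, the link-budget win would be huge. A rough free-space
path loss comparison (standard FSPL formula; the distances and the 12 GHz
carrier are nominal assumptions on my part):

    import math

    def fspl_db(distance_km, freq_ghz):
        # Free-space path loss in dB for distance in km, frequency in GHz.
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

    plane_ahead = fspl_db(160, 12)      # ~100 miles to the next plane
    geo_bird = fspl_db(35_786, 12)      # geostationary altitude in km
    print(round(plane_ahead), round(geo_bird), round(geo_bird - plane_ahead))
    # ~158 dB vs ~205 dB: about 47 dB less loss, i.e. roughly 50,000x
    # less transmit power for the same received signal strength.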
Genetically modified salmons approved by the FDA - jaequery
http://www.newscientist.com/article/dn23035-approval-for-genemodified-salmon-spawns-controversy.html
======
jeffool
As someone who doesn't care at all about eating genetically modified foods, I
expect the argument over labeling will break out soon. To fire the first shot:
I really wish they'd label genetically modified foods. It seems a small
courtesy for those who do have concerns. And those who don't? Well, we don't
care. That's the point.
~~~
DanBC
You are rational. Many people are not. Labelling a food as "genetically
modified" is the same to them as labelling it as "radioactive" or "contains
excrement".
I'm not sure if labelling food as genetically modified is a useful warning, or
if it would be the same as saying "wifi used in this building". WIFI isn't
harmful, and eventually it's going to be everywhere and not avoidable. Yet you
still have people who claim that electronic smog causes all sorts of things.
I'm happy to eat these things. I'm gently worried about releasing organisms
into the wild with "exogenes" (or whatever they're called). Evolution is
amazing and powerful and wonderful. And human intervention in eco-systems
isn't filled with particularly great examples - a long list of invasive
species comes to mind.
~~~
malandrew
How many buildings in California do you regularly come across with the sign
"This building contains chemicals known to the State of California to cause
cancer", or something to that effect? They are everywhere. It doesn't stop 99%
of people from entering these buildings to work or live in them.
------
earbitscom
Other animals bred to have fast-growing qualities typically suffer incredibly
painful problems supporting their unnatural weight. When are people going to
realize that:
A) just because you can do something doesn't mean you should.
B) animals are not playthings put on Earth for us to do whatever we want with.
~~~
unimpressive
We can eventually fix the pain thing if that's an issue.
And Christianity takes an opposing view, meaning that this attitude is deeply
ingrained in western culture. (Which I happen to agree with.[0])
"01:001:026 And God said, Let us make man in our image, after our likeness:
and let them have dominion over the fish of the sea, and over the fowl of the
air, and over the cattle, and over all the earth, and over every creeping
thing that creepeth upon the earth." - King James Bible
Source: <http://www.gutenberg.org/cache/epub/30/pg30.txt>
[0]: The attitude, on this particular issue.
~~~
DanBC
Let's avoid the vegetarian arguments for the moment, because they are usually
more heat than light.
Are you saying that we can do whatever we like to animals?
Or that we can do whatever we like to animals so long as it benefits humans?
Or that we can do whatever we like so long as the benefit is significant?
Or what?
~~~
unimpressive
Okay.
>Are you saying that we can do whatever we like to animals?
Technically they can't stop us. But that's probably not the answer you're
looking for.
Inflicting unnecessary harm on semi-sentient life forms that feel pain is
probably immoral. So while we _can_ do whatever we want, certain things are
not _moral_ to do.
> Or that we can do whatever we like to animals so long as it benefits humans?
Right now current conservation efforts have the irreplaceable property of
extinct animals as an axiom. This is, for the moment, true. So we have to
measure decisions in something akin to utils. Killing off the zebra to cure
somebody's headache probably isn't a smart trade.
At the same time we need to be cautious, when we start playing large games
with the ecosystem, we incur the risk of large losses. (See: Just about every
invasive species introduced as a direct result of human intervention.) Right
now one of the large reasons for saving species from extinction is
biodiversity. The less biodiversity in the ecosystem, the higher the chance of
a key species being wiped out by disease or aggressive predation and
collapsing the whole system. (See: Pollinating insects.)
Right now the benefit to humans of saving as many species as possible from
invisible deaths by our machinations is a net positive. In the future, it may
make sense to "upgrade" organisms to give fitness for a particular purpose.
This process may involve organisms being out-competed to extinction by GMOs.
> Or that we can do whatever we like so long as the benefit is significant?
Well right now current meat processing procedures induce great pain to the
animals involved. This is an externalized cost that does not show up in
consumer prices. Meaning that even though the morally optimal solution would
be something like naturally grown animals on a large pasture, the
evolutionarily optimal solution ends up being animal death camps.
There are efforts to _grow_ meats independent of their traditional hosts
through bioengineering, but these efforts have not taken off yet. If
successful, they could save millions of animal lives, at the potential cost of
the majority of the species. (Or even all of it.) Is that worth it? Is it
selfish for humans to "obsolete" an animal they no longer find useful? We did
this to horses.
We need to be careful about "benefit". The Soviets drained the Aral Sea in a
large irrigation project. [0] This of course resulted in the destruction of
the surrounding ecosystem, and is now apparently leading to health problems
for the people who live there locally. Was that worth it?
[0]:
[https://en.wikipedia.org/wiki/Great_Plan_for_the_Transformat...](https://en.wikipedia.org/wiki/Great_Plan_for_the_Transformation_of_Nature)
tl;dr: Currently the only way to satisfy our demand for meat is to slaughter
millions of animals. Because of market pressures, these animals end up dying
painfully, or even living painfully. This is not a morally optimal situation
and I feel that certain aspects could be improved, with potentially disastrous
consequences for the species involved.
~~~
earbitscom
"Semi-sentient"? "Smart trade"?
This attitude toward animals is just sad. We're talking about other beings
with emotions and desires. I wish scientists could GMO humans into not being
so arrogant toward other species. That's a genetic experiment I could get
behind.
~~~
unimpressive
I use the former term to clarify that I don't ascribe human levels of
consciousness to animals, do you?
As for the latter, since you brought it up:
The thing about stuff like "How many Zebras are worth a headache." is that if
you think about it enough, you end up at questions like "How many zebras are
worth a human?" and then "Are all human lives equal?". When trying to answer
such questions things get fuzzy and icky and hard to answer satisfactorily.
Then there are versions of those questions where you ask if the answer changes if
it's _your_ life being weighed.
Even though it's intuitively obvious that somebody's fleeting minor pain is
not worth the cost of losing an entire species of large mammal, talking about
"fair trades" with sentient creatures gets weird and frustrating very
quickly.[0]
[0]: I personally like the phrase "Infinite Hair" to describe situations like
this. (ftp://ftp.trailing-
edge.com/pub/rsx11freewarev2/rsx81b/374001/jargon.txt)
~~~
earbitscom
No, I do not feel the need to differentiate between the varying consciousness
of animals in a discussion such as this. I feel they are owed the basic right
to life, freedom from suffering, and an equal right to that which they have an
interest in, such as raising their young. For every qualification you could
find to rationalize treating animals poorly, aside from the species label we
have applied to them, the same qualifications can be found in certain people.
That's why it gets fuzzy and icky to start calculating "smart trades" that
compromise the rights of some for the good of others.
You should check out Animal Liberation by Peter Singer. I'm not saying I agree
with everything he supports, but he makes a bulletproof argument supporting
the fact that speciesism is just like any other discrimination, and relies on
the same logic as racism, sexism and so on.
------
adulau
[http://www.fda.gov/downloads/AdvisoryCommittees/CommitteesMe...](http://www.fda.gov/downloads/AdvisoryCommittees/CommitteesMeetingMaterials/VeterinaryMedicineAdvisoryCommittee/UCM224760.pdf)
"The potential hazards addressed in this EA center on the likelihood and
consequences of AquAdvantage Salmon escaping, becoming established in the
environment, and spreading to other areas. These hazards must be addressed for
the production of eyed-eggs, grow-out to market size, and disposal (i.e., of
fish & fish wastes)."
and
"As discussed in §2.4.2.1, the estimated escape rate of salmon from sea cages
is about 1%. Sea cages, or net pens, have a direct connection with the aquatic
environment."
1% of 50,000-90,000 fish in a sea cage (i.e. 500-900 fish) is not something
negligible; the first direct impact might be against the non-GE Atlantic
salmon (Salmo salar), especially since salmon eat salmon eggs, so the dominant
species might become the GE one.
You might say: oh, it's fine, it's not touching the diversity of salmon, it's
just replacing the Atlantic salmon with another one. Wait: they took a coding
sequence from the Pacific salmon (specifically from Oncorhynchus tshawytscha)
and added the "anti-freezing" protein from Zoarces americanus. So this
"subspecies" got an interesting set of properties to find its place in the
Atlantic and/or the Pacific (where the diversity of salmon (Oncorhynchus) is
much higher). Those risks exist even without any spreading of the genetic
modification (assuming that triploid induction is effective with a probability
of 1, another point where the scientific literature puts the probability of
effectiveness lower).
I suppose those risks are not really considered by the FDA as critical because
the F is for Food in FDA. So the risk of changing the whole profile of wild
salmon with such "GE" salmon is not negligible.
~~~
allerratio
> As a precaution, the fish are all female and contain three copies of each
> chromosome rather than two, rendering them sterile.
They won't replace Atlantic salmon.
~~~
jaequery
"But the sterilization process is not perfect; up to 5% of treated salmon
could still reproduce."
------
rjzzleep
I stopped eating salmon a while back; the way they're farmed is nothing short
of disgusting. I think a death rate of 20% is considered good in those tanks.
But seriously, who wants to eat animals that live on corn? For a subtle
introduction on where US meat comes from, I suggest
<http://www.eatinganimals.com/>
~~~
mitchi
Everyone should rethink their meat consumption. First, if you read nutrition
books, you realize that in the long term, it's not healthy. It's much better
to eat vegetables. Secondly, you can get all the nutrients and vitamins you
need from whole foods. Thirdly, it can be cheap. A can of chickpeas is around
$1 and it really fills you up for many hours. I'm not saying everyone should
be vegetarian. But if everyone was 70% vegetarian, things would be very
different.
~~~
enraged_camel
>>First, if you read nutrition books, you realize that in the long term, it's
not healthy. It's much better to eat vegetables.
As someone who is eyeball-deep in nutrition science, I demand a credible
citation.
~~~
Borkdude
Read the books by Joel Fuhrman, Eat to Live being his most famous one,
[http://www.amazon.com/Eat-Live-Amazing-Nutrient-Rich-
Sustain...](http://www.amazon.com/Eat-Live-Amazing-Nutrient-Rich-
Sustained/dp/0316206644/ref=sr_1_1?s=books&ie=UTF8&qid=1356857742&sr=1-1&keywords=eat+to+live)
He's not against eating meat, but portions should be very low (5-10% of
calories from animal products, no more, and this is a veeery small piece of
meat). A (near-)vegan diet with B12- and DHA-supplements is the healthiest
(DHA can be derived from algae instead of fish oil - the fish get it from
algae too). An objection may be: but eating pills isn't natural. Well, eating
GMO-fish surely isn't natural either.
A website with a lot of little videos about the same nutrition principles can
be found here: <http://nutritionfacts.org/>
~~~
sliverstorm
If you say, "5-10% of your diet should be meat" you will find your audiences
much more receptive than if you say "You shouldn't eat meat". In my case this
is because I don't trust extremist _anything_ , particularly when it comes to
food.
If the former is your belief and not the latter, please open that way.
~~~
Borkdude
I was reflecting the message of Fuhrman MD. He's not saying 5-10% should be,
but COULD be meat or animal products (including eggs, dairy, etc). 0% is also
fine or even better. He allows this percentage because some people find it too
restrictive to be vegetarian or vegan. So optimally you should keep it as low
as possible, but there isn't much lost if you still keep it around 5%. Most
people don't realize one egg is already that amount. You'd have to eat vegan
the rest of the day to stay within limits. Animal products can be used as
condiments to flavor your food, not as a substantial part of your diet. If you
want to have a look at the scientific data, read his book(s).
------
jaequery
Upon further research, here are a few things that caught my attention:
#1) There was a bit of Congress lobbying to get it approved.
#2) There is a 5% chance that these fish could actually become fertile.
#3) These fish have an extra chromosome and a novel protein not found in any
other salmon.
~~~
cpa
Could you share your sources? (I'm genuinely interested vs. trying to be a
dick)
------
Surio
Dan Barber's <http://en.wikipedia.org/wiki/Dan_Barber>
TED talk might be relevant to this discussion ;-]
[http://www.ted.com/talks/dan_barber_how_i_fell_in_love_with_...](http://www.ted.com/talks/dan_barber_how_i_fell_in_love_with_a_fish.html)
------
tomkinstinch
There are legitimate concerns surrounding genetically modified foods. Some
potential problems include the creation of monocultures that permit rapid and
more complete devastation from disease, selective pressure that creates
superior pests and predators (positive feedback toward creating monocultures),
negative ecological network effects, overuse of pesticides, and reliance on
commercial seed (you think software patents cause problems...). Food safety is
one, but it's a given for crops intended for human consumption.
That said, the real problem is fear from people who do not understand the
science. We've been modifying what we eat for thousands of years. Selective
breeding has allowed us to feed today's world. Development of short-stalk,
high-yield, disease resistant wheat by Norman Borlaug in the 1950s secured a
food supply for Mexico and India. His cultivars are now credited with saving
more than a billion lives. Where selective breeding got us to where we are
today, genetic engineering will carry us into the future. We need higher
yields per acre to support a growing population. That means industrialized
farming and cultivars that support industrial methods. So-called "Roundup-
ready" varieties of plants allow mass spraying of herbicides over fields, and
let desirable plants grow without competition from weeds (the plants have been
made resistant to the herbicide). Corn engineered to carry the "Bt" gene for a
bacterial endotoxin produces its own insecticide. In each case the
modification allows for a higher yield per acre. Some may say that selective
breeding is different than recombinant methods, which is true. Introducing
exogenous DNA into an organism is different than crossing two parent
organisms. The introduction of new DNA or new alleles does occur in nature
however, via viral insertion or mutation. Somewhere between 5% and 10% of our
own DNA is the result of viral insertions. Plants and salmon are affected by
viruses too. We are only accelerating evolution down particular paths (albeit
perhaps down very unlikely paths).
So genetic modification is not inherently harmful, and it can confer
wonderfully advantageous benefits. Baseless fear is unwarranted, but cautious
concern is justified. When having debates or reporting on GM foods it is
important that we discuss both the benefits of GMO foods along with the
concerns I mentioned above. It is prudent that any new organism be fully
studied and understood before it is deployed in the outside world, and at a
large scale. Environmental impact needs to be understood. In the US, GMO foods
must clear many regulatory hurdles before being approved. Among these are
complete characterization of the genomically integrated transgene(s) and
demonstration that the transgene(s) remains stable over multiple generations.
Apparently these and other criteria have been met to the satisfaction of the
reviewers examining the farmed salmon. Here is the environmental impact study
provided to the FDA by the salmon company:
[http://www.fda.gov/downloads/AdvisoryCommittees/CommitteesMe...](http://www.fda.gov/downloads/AdvisoryCommittees/CommitteesMeetingMaterials/VeterinaryMedicineAdvisoryCommittee/UCM224760.pdf)
Even though I support genetic engineering of foods, I am against labeling of
GMO foods as currently proposed because I think _a simple statement is
inadequate_. Saying "GMO food" says nothing about the modification, and only
serves to incite fear. In a time of rapid whole genome sequencing, I want to
be able to scan a QR code on a package and get a link to the GenBank entry for
the food organism in question, along with the impact studies and a plainly
worded overview. I want to see notes for which insertions, deletions, or other
changes were made. I want to see the DNA diff on the genomic source code.
Public food should be open source.
A few years ago I had lunch with Richard Stallman (a fun story itself), and
wanted to get his take on gene patents and "closed source" organisms. While he
expressed an understandable disapproval of corporate monopolies on crops and
biopharmaceuticals, he did not seem to have a strong desire for Freedom in
genetically engineered products. I found that surprising. He thought the
barrier to entry was too high for people to make their own genetic changes as
they might make changes to software. I think we need to consider that genetic
engineering is only going to get easier.
From some cursory reading it looks like the salmon in question in this article
has been modified with a promoter (kind of like a compiler flag to enable
production of a gene product) from the pout fish, and a subsequent growth
hormone gene from a different salmon species. It grows faster, is more
aggressive, and is sterile. It seems that the aquaculture companies interested
in using the fish intend to keep it isolated from the outside environment, but
even in the event of a release the fish would merely eat prey. They would
ultimately die. Since they are sterile, the likelihood is low that they would
be able to outcompete unmodified variants in their single generation. It seems
they are safe all around.
If anyone is curious, the inserted gene construct is known as opAFP-GHc2, and
its CDS source is available:
<http://www.ncbi.nlm.nih.gov/nuccore/56691717>
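For anyone who wants to poke at it, a minimal sketch of fetching that record
with Biopython (the GI number is the one from the link above; the email is a
placeholder that NCBI requires you to fill in):

    from Bio import Entrez

    Entrez.email = "you@example.com"  # NCBI asks for a contact address

    # GI 56691717 is the opAFP-GHc2 construct record linked above
    handle = Entrez.efetch(db="nucleotide", id="56691717",
                           rettype="gb", retmode="text")
    print(handle.read())  # the full GenBank flat file, annotations and all
    handle.close()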
~~~
mitchi
Thank you for this information. And your statement "Public food should be open
source". That's something worth discussing as a society. Food is so
important... What do you think about genetic engineering for humans? Is that
also the future? We can cut medical costs for everyone by engineering humans
to be immune to many known diseases. And better stuff.
~~~
tomkinstinch
Gene therapy is inevitable and is the future. Performing genetic engineering
on new human offspring is different, and is a debate perhaps best left to
ethicists.
| {
"pile_set_name": "HackerNews"
} |
How proud are Stanford Alums? - kremdela
http://stanford.buyalums.com
======
GFK_of_xmaspast
This looks like some kind of scammy thing, but I've known a lot of top people
from a lot of top schools and there's nothing like a Princeton alum for
school loyalty (me personally I went to state schools and have been dodging
the fundraisers for years and years)
~~~
kremdela
I promise we aren't trying to be scammy. But I definitely appreciate the
cynicism.
I have little interest in collecting or doing anything with alumni email
addresses.
We are trying to validate that there is any demand for a directory like this
for our pitch to Alumni Offices.
Any responses we would get would be, by definition, a subset of the alumni
office list. We want to build a platform for alumni offices to showcase
accomplishments to their communities.
Presumably there would be a correlation between alumni pride and fundraising,
but we don't want that to be shoved down people's throats.
| {
"pile_set_name": "HackerNews"
} |
Self-healing electric ink refuses to die when cut - jonbaer
https://techcrunch.com/2016/11/02/self-healing-electric-ink-refuses-to-die-when-cut/
======
xkcd-sucks
Cool, now we can actually light stuff on fire by overloading the conductive
ink
| {
"pile_set_name": "HackerNews"
} |
Python Enhancement Proposal 495: Local Time Disambiguation - philipn
https://www.python.org/dev/peps/pep-0495/
======
kbenson
I'm not sure about the rationale behind this. That is, the rationale section
of the proposal does a poor job of explaining any case where this is actually
a problem.
In every case where I've seen this problem, it's a matter of people either not
storing the timezone along with the local time, or not storing in UTC time. A
local time with a timezone is a unique time, it does not occur twice. A UTC
time additionally does not occur twice. Store a time zone along with the date
and time or store in UTC and convert on use.
Note: If there are instances where a second is repeated, it's a rare special
occurrence, and developing a formalized interface for it seems like overkill.
~~~
deathanatos
> A local time with a timezone is a unique time, it does not occur twice.
If you define timezone as an IANA timezone, this is incorrect: a whole slew of
local times repeat during a DST fallback event: you'll have a (1:30 AM
(dst=True), America/New_York), and then a (1:30 AM (dst=False),
America/New_York); that "dst=True|False" bit is the only difference, and that
needs to get stored. If you consider "America/New_York" to be the TZ, then
storing that bit on the TZ isn't appropriate, as it depends on a particular
timestamp.
If you've ever worked with PyTZ, there's a sort of rule of "just call
normalize() always"; otherwise, you'll get funny answers to some
introspections on the datetime instance: things like the offset being not what
a local would say the offset should be. My understanding is that pytz stores
the dst flag on the timezone instance itself; things get funny because the
timezone instance is not given a chance to update after arithmetic on the
datetime instance.
(Really, I feel like the whole thing would work better if there was a separate
class for "instant in time" and a function for, "convert this instant in time
to Gregorian year/month/day/etc. in this TZ", which then returned a broken-
out type. (And a reverse, of course, for building "Instant" instances.))
UTC datetime + IANA TZ (if relevant) is the way to go. Alas, not all data is
so nice.
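To make the repeated wall clock concrete, a small pytz sketch (zone and date
chosen as an example):

    from datetime import datetime
    import pytz

    tz = pytz.timezone("America/New_York")
    naive = datetime(2015, 11, 1, 1, 30)  # 1:30 AM repeats on fallback night

    first = tz.localize(naive, is_dst=True)    # the EDT 1:30, UTC-4
    second = tz.localize(naive, is_dst=False)  # the EST 1:30, UTC-5

    # same wall clock, two distinct instants an hour apart:
    print(second.astimezone(pytz.utc) - first.astimezone(pytz.utc))  # 1:00:00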
~~~
kbenson
Really I was thinking of it as a distinct timezone that must be tracked.
Whether the DST and non-DST versions label themselves as such, the
representation used to track time must distinguish whether DST is active or
not to display the correct local time. Really, when I say store local time +
timezone, I mean local time plus identifier that gets you to the same unique
timezone representation in your medium (python, in this case).
Personally, I just _always_ convert to UTC and store that. It changes the
problem from one of data fidelity to display or computation annoyance, and
annoyances are easy to reduce or eliminate with tooling.
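In standard-library terms, that discipline is just (a sketch):

    from datetime import datetime, timezone

    stored = datetime.now(timezone.utc)  # store: capture the instant in UTC
    local = stored.astimezone()          # display: convert at read time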
~~~
deathanatos
> Really I was thinking of it as a distinct timezone that must be tracked.
> Whether the DST and non-DST versions label themselves as such, the
> representation used to track time must distinguish whether DST is active or
> not to display the correct local time. Really, when I say store local time +
> timezone, I mean local time plus identifier that gets you to the same unique
> timezone representation in your medium (python, in this case).
I guess that's my point: the IANA identifier is a well-known way to serialize
a TZ, but doesn't include DST flags because they're not relevant. I think if
you wanted to store something like a Python tzinfo object, the easiest way is
just storing (local time, offset from UTC); (maybe (local time, offset from
UTC, IANA TZ ID), if you want to keep the TZ)
tzinfo's don't really have a defining quality in Python, I've found. You can
end up — depending on libraries used — with two tzinfos that both conceptually
are "UTC", but don't compare equal…
Now that I've thought about it again, I'm not entirely sure that the DST flag
+ TZ name by itself is sufficient, mostly in the case of a TZ deciding to
change their offset.
> just always convert to UTC and store that. It changes the problem from one
> of data fidelity to display
The right thing to do, and for the right reasons.
------
toyg
Terrible PEP, I hope it gets rejected. One-off flags like this are hacks that
shouldn't be in stdlib. It simply stinks, in an area (time handling) where the
stdlib does not really smell like roses already.
Dealing with time adjustments is the OS's job, not userland. If your job has
to be scheduled exactly and cannot rely on the OS, and you refuse to deal with
UTC, it's your own damn fault and you can always just use a long-running
process with timers.
~~~
bliti
I agree. It's library baggage.
------
lifeisstillgood
So :
Time in the UK is currently UTC+1 (BST). At 2am on 25 Oct we will return to
GMT/UTC. It will therefore become 1am, and for the next hour all times will
have happened before.
The idea is to put a bit flag that says "alreadyseenthistime"
It seems to me this is a solution to the wrong problem.
Store all strings as bytes, assuming UTF-8, store all times as longs assuming
UTC
If we convert all Python datetimes to non-naive (i.e. embedded with a TZ)
then we are forced always to choose an encoding, just like with strings. The
right encoding is to always assume incoming dates are UTC, to throw an error
if they are non-naive, and to assume that local clocks are set correctly
(which we do anyway).
I need to read it more carefully - but it seems the wrong solution
------
Marazan
In general any solution to datetimes that doesn't involve the time being in
UTC is solving the wrong problem.
~~~
akvadrako
I would disagree. UTC is ambiguous and unpredictable. Computers should really
be based on GPST (seconds since 1980-Jan-6 UTC).
~~~
ubernostrum
If you want to be pedantic about getting rid of unpredictability in your time
measurement, use TAI.
------
IanCal
When the clocks change, don't you shift timezone? There aren't duplicate times
in BST, we just switch from GMT to BST and then back.
> In these situations, the information displayed on a local clock (or stored
> in a Python datetime instance) is insufficient to identify a particular
> moment in time.
Does the datetime instance not store the timezone?
~~~
msm23
Yes, you could get the information from the timezone, but how would one do
that in code?
The only time that one has the time fold is when you turn the clock backward
(let's just call that shifting from daylight savings to standard timezone).
And this would only affect code which used wall clock time (time as it's read,
e.g. 1:30am PDT), and would also only affect code which wanted to run
something only once at a time within that fold (e.g. 1:30am ... not on both
1:30am's).
So, using the timezone method, just check to see if your current 1:30am is in
your daylight savings timezone. Hurray! You're in the clear. Go ahead and do
that thing you wanted to do only on the first 1:30am.
But the next day you're going to run into a problem. The only 1:30am you're
going to get is in the standard timezone. So now you have to check for this
timezone change only on the day of the change, which is yet another piece of
data you have to keep track of. On the day of the change, do this timezone
comparison, and on every other day don't worry about it.
When the clock hits your interesting time of 1:30am, just check to see if
today is the day of the change, check what the current timezone is, check what
the daylight savings time zone is, check to see if those two values are
the same, and now do your thing. Otherwise, just do your thing.
All of the above also ignores that people change times at different times
(11pm, 1am, 2am, 3am), some don't change a full hour, and some don't change at
all.
The proposal gets rid of all of that convoluted logic in everyone's programs,
and instead it provides a single boolean value: is this the second time I've
seen this time because of daylight savings shenanigans.
~~~
IanCal
> The proposal gets rid of all of that convoluted logic in everyone's programs
Does it? It doesn't cover the scheduling problem the other half of the year
when the clocks move the other direction.
> So now you have to check for this timezone change only on the day of the
> change, which is yet another piece of data you have to keep track of.
If running a job twice is a problem, then why not check that the job has not
already been run?
> is this the second time I've seen this time
Is this unambiguous? If it's 2015-10-25-01-30-00 GMT, have I seen that time
before? In the UK, yes, in Mali no.
------
mayoff
The more I'm exposed to other date/time libraries, the more impressed I become
with Apple's date/time library.
[https://developer.apple.com/library/mac/documentation/Cocoa/...](https://developer.apple.com/library/mac/documentation/Cocoa/Conceptual/DatesAndTimes/DatesAndTimes.html#//apple_ref/doc/uid/10000039-SW1)
| {
"pile_set_name": "HackerNews"
} |
Tipjoy (YC W08) now allows payments via Twitter - ivankirigin
http://tipjoys2cents.blogspot.com/2008/12/tip-via-twitter.html
======
ivankirigin
I'm really excited about this because of where we can go with this.
The idea of a "rtip" or tip/retweet is the biggest innovation here. It's
exactly how twitter is already used to disperse information, but adds a social
gesture with monetary weight. That's pretty powerful. If you like a tweet,
just say "rtip $1 @username the awesome tweet".
Lots of sites use Twitter credentials, and this means they can initiate
payments. It also makes those payments inherently social, as they are
broadcast - so in some ways it is better than an OAuth system. But we're planning
that too.
We are also accepting new signups via Twitter credentials
<http://tipjoy.com/createaccount/platform/twitter/>
We're going to open this up to an API, meaning sites based on twitter
credentials can convert their whole user base to tipjoy users. I'm really
looking forward to see what can be done with these tools.
~~~
omakase
The potential for this idea is what got me excited the minute I saw your post.
As an aside, I think twitter still needs to release support for OAuth, there
are too many sites out there storing twitter credentials in plain text right
now. Careful developers should be hashing the passwords with some secret salt
so at least their users passwords aren't at risk if someone ever gains access
to their db.
These aren't really your issues, but twitter needs to add proper
authentication. Anyways -- this is a great from tipjoy -- looking fwd to what
comes next.
------
halo
I'm sure this wasn't 'inspired' by twitpay.me at all, released and announced
here 15 days ago (<http://news.ycombinator.com/item?id=369568>)
~~~
ivey
They say imitation is the sincerest form of flattery, so we at Twitpay are
taking it as a compliment.
~~~
Herring
With compliments like that, who needs insults?
------
juanpablo
Too many "pay words"! I don't know if that's the case but you should really
check they aren't part of a casual tweet.
Eg.
"I'm not going to pay @joe $1000 for an old ipod!" or
"Everyone! Give @bob $10 to buy beer".
~~~
ivankirigin
We thought of this. They need to be in the first 3 tokens.
So this would indeed count: "Give @bob $10 to buy beer". But that is clearly
closer to a command.
More importantly, this is an opt-in option, so it would be hard to do it
without knowing.
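A minimal sketch of that kind of guard (hypothetical code, not our actual
parser):

    PAY_WORDS = {"pay", "give", "tip", "rtip"}  # assumed trigger words

    def looks_like_payment(tweet):
        # only treat it as a command if a pay word is in the first 3 tokens
        return any(t in PAY_WORDS for t in tweet.lower().split()[:3])

    looks_like_payment("Give @bob $10 to buy beer")             # True
    looks_like_payment("I'm not going to pay @joe $1000 ...")   # False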
------
Dilpil
Am I the only one who is EXTREMELY skeptical that anyone is going to actually
give out money using these applications?
------
mattmaroon
Nice feature. I would use this if I frequented Twitter. I have sick friends
who would throw money around like that for fun though.
~~~
ivankirigin
You tweet sometimes. They're usually pretty funny, so you should do it more.
I'm pretty sure twitter will be closer to an ecosystem of tips, considering
everyone is a "blogger" - though some with greater followings.
~~~
mattmaroon
Yeah, I log in a few times a week, and read whatever posts are on the first
page. That's about it.
------
bootload
_"... Tipjoy now allows payments via Twitter ..."_
Go Abby & Ivan
------
nostrademons
This is pretty cool. Congrats, TipJoys!
------
smahend
"inspired" is probably not the right word. i think directly stolen from
twitpay is probably better
~~~
calambrac
"Stolen", really? It's business, not super friendly fun time. Competition
happens.
~~~
ivankirigin
Tipjoy is all about super friendly fun time. Like at our poker nights. Come by
if you're in the Boston area (and a startup hacker/foudner):
<http://www.meetup.com/Boston-Startup-Poker/>
~~~
gwc
What if you're not, but would like to be?
~~~
lanceweatherby
Come up with a decent concept and team and then get funded by YC.
~~~
gwc
Seems a little involved just to get in on a poker game. :)
~~~
ivey
If you're already in Boston, the startup part is easy.
Idea+Incorporation=Instant CEO
He didn't say you had to be part of a _viable_ startup to play.
~~~
ivankirigin
Viable is determined on exit, right?
Anyone reading this is welcome to the game. And if you want an excuse to move
to Boston, tipjoy is hiring: <http://tipjoy.com/jobs>
~~~
ivey
Ivan, I assume you saw @tensigma's invitation on Twitter to play poker if
you're ever in Atlanta, but I'll mention it here, too. He's not any good,
though, so go easy on him.
~~~
ivankirigin
Even people that think they are good are still usually just playing by
intuition, like me. I'd love to play. We should set up a network of startup
poker games across the country. Poker is the new incubator.
Given this fundraising environment, it might be a good financing strategy too.
~~~
ivey
That was a session at BarCamp Atlanta: Startup funding via poker.
~~~
ivankirigin
Zynga does that, but plays the role of the house.
| {
"pile_set_name": "HackerNews"
} |
1978 – 'Farewell, Etaoin Shrdlu' - wglb
https://www.nytimes.com/times-insider/2014/11/13/1978-farewell-etaoin-shrdlu/
======
teilo
I recommend another documentary for anyone interested in the history of the
Linotype:
[http://www.linotypefilm.com/index.html](http://www.linotypefilm.com/index.html)
Used to be on Netflix, but not presently.
------
schoen
Despite the year in the title, this is really a (2014) obituary for a _New
York Times_ Linotype operator who died in 2014 and was profiled in a 1978
documentary.
| {
"pile_set_name": "HackerNews"
} |
Major setbacks for two new smartphone OSs, Tizen and Ubuntu Touch - cpeterso
http://gigaom.com/2014/01/17/major-setbacks-for-two-new-smartphone-oss-tizen-and-ubuntu-touch/
======
mojuba
I have an impression that the goal of all these new iOS/Android challengers is
to get Linux on the mobile phone at all costs. Whereas the goal should be to
create user experience that would be at least as pleasant and smooth as that
of iOS, plus openness and transparency of the platform. At least.
That may imply Linux due to the lack of viable alternatives, but really, it
should be built with users in mind, rather than perceived as a technical
challenge of putting Touch on top of Linux.
(And please, stay away from Java in your next mobile OS. Among other things,
you will ward off those of us developers who, let's say, are not neutral with
respect to certain programming languages. Edit: make Java optional if you
wish.)
~~~
fafner
This has little to do with getting Linux on mobile. Maybe it's the motivation
for some. But not the driving aspect.
Samsung wants Tizen to gain back full control of their platform. Canonical
wants Ubuntu Touch (and Mozilla Firefox OS) because the PC is getting less
important for consumers.
> (And please, stay away from Java in your next mobile OS. Among other things,
> you will ward off those of us, developers, who let's say are not neutral
> wrt. to certain programming languages. Edit: make Java optional if you
> wish.)
Neither of them is based on Java. Tizen and FirefoxOS are (sadly) JavaScript
only. Ubuntu Touch supports JavaScript/HTML5 and Qt (JavaScript + C++).
~~~
jevinskie
Tizen has a native C++ toolchain just like the Android NDK.
~~~
pjmlp
Last time I looked it had the Frankenstein Bada influence, with two-level
initialization, no exceptions, handles vs. pointers, and so forth.
Is that still the case?
~~~
jevinskie
I'm unsure, I haven't done any app development, just toolchain work.
------
hdevalence
It's a shame the way that Canonical have isolated themselves from the rest of
the Linux ecosystem. I wonder if the future for Ubuntu Touch would be brighter
if they had more community appeal.
~~~
wtracy
How so? Android and Firefox OS can run few if any GNU/Linux applications
without major modifications. Ubuntu runs every Linux app I can think of out of
the box.
~~~
hdevalence
I was referring more to isolation in two related senses:
1\. A technical sense: Canonical has a significant amount of distribution-
specific software that has very little adoption outside of the Ubuntu
ecosystem. Examples include:
* their own version control system, bzr (which I think has now been essentially abandoned in terms of development, and other than a few GNU projects never saw much uptake at all);
* their own desktop shell, Unity, which to the best of my knowledge has never been properly packaged for any non-Ubuntu distribution;
* their own display server, Mir, whose development is the subject of much controversy, and which has been rejected for well-founded technical reasons that have probably already been discussed at length on HN;
* their own init system, Upstart, which was initially adopted by other distributions, who then abandoned it in favour of systemd due to its fundamentally flawed design (for instance, there is no way to 100% reliably keep track of a process's children and the dependency resolution runs backwards, meaning that some configurations can cause hangs).
2\. A social sense: Canonical's relationship with the rest of the Linux
ecosystem has soured (beginning, in my opinion, around 2009-2010, but as a
gradual thing, it's hard to pin down) to the point of outright hostility. In
my opinion, most of the responsibility for this lies with Canonical's poor
communication with others. There's a very detailed history of some events from
2009-2011 that you can read here [1] detailing their relationship with GNOME.
The culmination this year has been disgraceful personal attacks by Mark
Shuttleworth on people who raise technical criticism of Canonical's software.
I expect that more will follow whatever the fallout of Debian's systemd
decision is.
The two are obviously related. The big problem is that there's (again, in my
opinion) very little community appetite for contributing to Canonical's
software. And why would there be, given that in order to submit patches for
Ubuntu Touch, I need to give Canonical permission to relicence my code under
any proprietary licence they choose? At that point, it's not a community
contribution, it's just working for Mark Shuttleworth for free.
Does Canonical have the software engineering capability to maintain a bunch of
stuff (an init system, a display server) that's really unrelated to their
problem (making awesome products for users) and still be able to take on
Android at the same time? I don't think so.
[1]: [http://bethesignal.org/blog/2011/03/12/relationship-
between-...](http://bethesignal.org/blog/2011/03/12/relationship-between-
canonical-gnome/)
~~~
jmhain
Unity has been packaged for Arch. I have absolutely no idea why, but it has.
~~~
RBerenguel
That tendency of Arch users to port everything and try everything (Arch is my
favourite Linux distribution, I quite like this style)
------
pinaceae
As long as they do not solve a problem for either the carriers or the
consumers that is not solved by iOS or Android, they have no chance.
Carriers and manufacturers love Android because they get it from Google for
free. Cosumers love it because they can get cheap, good smartphones.
Carriers love the iPhone cause it generates massive revenues for them as
customers love it. See what happened with NTT doCoMo as an example.
Tizen, ubuntuOS, firefoxOS, even Windows Phone do not solve anything between
those two. "Openess" is not a problem for the above.
~~~
lnanek2
Sometimes an extra option is desired by companies like carriers just so they
can pit all the options against each other and get better deals. Similar
happens in the OEM space. An OEM may be happy with Qualcomm, but they are
going to flirt with NVIDIA and use them occasionally just to get better deals
from Qualcomm.
~~~
rimantas
This only works with options which are also desired by customers.
------
rquirk
Regarding the first-edition Nexus 7 - Canonical will no longer produce Touch
images for it, so the author of this article is going to be even more
disappointed during 2014.
[https://lists.launchpad.net/ubuntu-
phone/msg05889.html](https://lists.launchpad.net/ubuntu-phone/msg05889.html)
------
frik
Samsung should release high-end Tizen (Bada3) smartphones in Europe. Most use
Samsung devices here with Android. If Samsung keeps its TouchWiz UI and ships
an Android compatibility layer, most people would buy it anyway. They buy it
because of the Samsung brand name!
Bada was successful in 2011 and 2012 (3Q/2012: 5,054,000 worldwide sales,
3.0% market share). Bada sales were higher than Win7/8 smartphone sales even
back in 2012:
[http://upload.wikimedia.org/wikipedia/de/e/e7/2012_11_15_Sma...](http://upload.wikimedia.org/wikipedia/de/e/e7/2012_11_15_Smartphones.jpg)
~~~
Oletros
Android compatibility layer doesn't include Google Services so no GMail,
Google Maps or other programs.
------
Daishiman
This was not unexpected; despite all the technical and political troubles iOS
and Android have, mobiles OSes can be considered a commodity; there is little
competitive advantage that any new contender could bring about at this point
unless something truly radical emerges.
Ubuntu's unified OS on all platforms seems like an interesting idea at first
glance, but Canonical has so far failed at providing the necessary vision for
that to come through.
~~~
argonaut
It's not so much that they're a commodity, but rather it's the fact that
Android and iOS are so entrenched. Both ecosystems are networks. In fact iOS
is a major competitive advantage for Apple, with all the apps that are iOS
first.
------
diminish
The mobile OS scene is amazingly diverse; Symbian, WebOS, Blackberry OSes, Bada,
->Tizen, Ubuntu Touch, FirefoxOS, SailfishOS and then Android/s, Windows* and
iOS. Anything else I omitted?
~~~
xamlhacker
There is also S40 platform powering Nokia Asha pseudo-smartphones. S40 is not
Symbian, it is a distinct OS.
------
programminggeek
I think at this point the new smartphone OSes (Firefox OS, Ubuntu, and Tizen)
are basically an attempt to stay relevant in a world where the primary
computer is a phone or tablet, not a laptop or desktop.
All three of them exist more because of the companies that built them, than
because there is a big gaping hole in the market that they fill. Without a big
problem they are solving I think the market as a whole will shrug at their
arrival sort of like the Palm Pre.
~~~
jebblue
>> in a world where the primary computer is a phone or tablet, not a laptop or
desktop.
Did you write that comment on a phone or a personal computer?
------
cliveowen
Major setbacks for two new smartphone OSs, Tizen and Ubuntu Touch. Namely,
people don't give a single shit about them.
------
dscrd
Sailfish OS is built on all the same technologies that are upcoming in the
desktop/server linux world, namely systemd, pulseaudio, wayland. They even use
BTRFS for the root filesystem.
_That_ is the real Linux phone.
~~~
tcfunk
Nothing that requires the end user to perform the installation of the OS is
the real anything.
| {
"pile_set_name": "HackerNews"
} |
Aug. 1, 2012: When Oculus Asked for Donations - amitkumar01
http://blogs.wsj.com/digits/2014/03/25/aug-1-2012-when-oculus-asked-for-donations/
======
modeless
This is the real shame in the Oculus acquisition. If Oculus had been able to
give away just 10% equity, every single Kickstarter backer would be $20,000
richer today. The sentiment would be completely different. The SEC needs to
get in gear and start allowing equity crowdfunding pronto.
The SEC's classification of millionaires as "accredited investors" who get
first crack at all the best investment opportunities is exactly the kind of
"rich get richer" policy that people ought to be furious about. I think not
enough people understand it.
~~~
icpmacdo
If a crowd funding website that was completely bitcoin based would it be able
to avoid all SEC regulations? If there are regulations in the future is there
a liability to creating this now?
~~~
anigbrowl
It's better to address the problem directly (investor eligibility) rather than
trying to invent technological end-runs around it, because the law is not
deterministic and such technological instrumentalities are very transparent to
regulators.
------
ama729
From Notch:
"And I did not chip in ten grand to seed a first investment round to build
value for a Facebook acquisition."
[http://notch.net/2014/03/virtual-reality-is-going-to-
change-...](http://notch.net/2014/03/virtual-reality-is-going-to-change-the-
world/)
------
cookingrobot
For $300 those backers got the promise of an awesome VR headset. I don't see
why they should get equity for free on top of that.
It would be nice if kickstarter campaigns could add equity as and option in
the reward tiers, but it's not like these people gave their money for nothing
in return.
~~~
kevingadd
You're assuming every backer paid 300. Some donated more, some paid less.
~~~
dlp211
And? They knew the terms of the deal.
~~~
hrjet
This is not a legal proceeding. It is more of an emotional outpouring of
disappointment.
The unwritten deal was that the funders were backing a low-overhead, indie
organisation.
~~~
joesb
So they can never become a big company?
It's like the pathetic jealousy of wanting to help a homeless person as long
as you know they are never going to be richer than you.
~~~
hrjet
You are conflating _big_ with _rich_. It is possible to dislike the former and
yet support the latter.
Often, when companies get _big_ , the motives change from "making a great
product" to "milking the user for more money".
------
phantomb
"a $2.4 million early-stage investment in what would become a $2 billion
business in a year and a half, in return for 0.0% equity"

No. Those backers weren't investing in a company, they were purchasing a
prototype device and software package. In practice, that's how Kickstarter
works. Sure, there will be a lot of backers who are happy that their deferred
purchase is helping a project actually get off the ground, but I guarantee
every person who handed over the $300 read over the rewards carefully.
------
shittyanalogy
1) Kickstarter is not an investment platform.
2) Kickstarter is not a store.
Kickstarter is ONLY for _giving_ your money away to ideas you want to see
succeed. The _perks_ are a gamble at best and misleading at worst.
If you give money to a kickstarter for any other reason, despite what the
campaign or your friends or some blog tells you, you're quite unfortunately
doing it wrong.
~~~
nemetroid
The perks are _not_ a gamble. If the project creator fails to deliver, you are
entitled to a refund. Straight from the Kickstarter Terms of Use:
> Project Creators are required to fulfill all rewards of their successful
> fundraising campaigns or refund any Backer whose reward they do not or
> cannot fulfill.
~~~
shittyanalogy
Through what enforcement?
------
mehwoot
Well you can't have it both ways. If you put your money in on kickstarter and
expect to get a product in return, you have bought the product. You didn't
invest.
Of course, how much people who put money into things on kickstarter expect to
get a product versus how much risk they are expecting to take is an open
question.
------
booruguru
I've always found it bizarre that people would willingly give hundreds (if
not thousands) of dollars to fund a business venture for some paltry token of
appreciation while the founders (and investors) receive all of the financial
rewards. (The "Veronica Mars" movie comes to mind.)
Also, I find it deeply upsetting that I am allowed to give my life savings
to a startup founded by a friend or family member, but I can't do this with a
stranger...because the government wants to protect me or something.
But, really, I can't blame regulators for these paternalistic policies,
since most people blame the banks for the financial crisis while pitying those
impoverished home owners who stupidly purchased homes they couldn't afford.
~~~
interpol_p
It wasn't a "paltry token" of appreciation in this case. It was the actual VR
headset, unlike anything else you could purchase commercially, for $300. I am
glad I got mine and knew what I was getting when I backed the project — a
really fun toy I could mess around with.
------
icambron
I'll just say it: Kickstarter is for suckers. When you give money to a
project, you're doing one of two things:
1\. Making a donation to a company. [1]
2\. Preordering something that hasn't been built yet.
Doing 1 is silly, since you don't really get anything in return. "But it makes
it more likely that this thing I want will happen!" In some tiny marginal way,
sure, but mostly it's going to happen because other people donate (or fails to
happen because they don't). Don't be the fool who tries to personally take on
the collective action problem. And stop trying to make other people rich out
of the goodness of your heart.
Of course, as the WSJ fails to make clear, most of Oculus's Kickstarter money
wasn't straight-up donations; it was preorders of the Rift. That's obviously
not a donation, but it's not a good idea either. As the buyer, you bear the
risk that it never ships at all. "But I'm compensated with a discount!"
Essentially, you're making an investment in which your returns come in the
form of future discounts on a product. Forget that you like the Oculus Rift
for a second; is this a wise investment structure? If someone set up a VC
company that did that instead of buying parts of companies, would you think
that was smart? Did you do any kind of analysis that suggests this actually
works out to be a good investment? Do the potential returns even justify that
analysis? Do you think of other consumer products this way, or only shiny
electronic things?
Or to think about it a different way: imagine if someone set up a store that
worked like this: you take your item to the counter, where they don't actually
let you buy the item. Instead what you can do is pay the price minus n% and
then they roll this big roulette to decide whether you get the product (m%
success rate). If you win you get to keep the product and if you lose it goes
back on the shelf and they keep your money. To spice things up, they don't
tell you what n and m are either, just the price to play and whether you get
the item. Now, it's possible--though unknown--that m and n work out so that
you're EV positive here. But would you really shop at that store? Especially
when there's another store next door that just sells you the same stuff at a
known price (i.e. just buy the Rift when it comes out).
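To spell out the expected value in that story (n and m as defined above; a
sketch):

    def roulette_store_ev(price, n, m):
        # pay price*(1-n) up front; win the item (worth price) with prob m
        return m * price - (1 - n) * price

    # positive exactly when m + n > 1: the discount has to at least cover
    # the failure rate before the gamble breaks even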
The fact of the matter is that you aren't pre-buying the Rift on a rational
basis. You've been convinced by clever marketing to shoulder risk for a
company because it _seems cool and feels good_. Total sucker move. That
probably explains why it tastes bitter when the company whose capital
requirements you fronted rolls that into a $2 billion dollar acquisition.
[1] Maybe it's not a company. Maybe it's a cause you support like improving
CoffeeScript or something. For those cases, I withdraw my objections.
~~~
InclinedPlane
Who cares?
You seem to live in a universe where everything goes according to plan, that's
not any real universe I'm aware of.
Kickstarter is about being a _patron_ of _creators_ for specific projects.
Sometimes those projects are by people working in good faith, sometimes not.
Sometimes those projects succeed, sometimes not.
The idea that this is somehow unusual is ridiculous. The idea that it should
be discouraged is actively harmful. One of the most powerful things anyone can
do with their money is to fund the development of things which change the
world. Create products they wish existed, create art they desire, support
creative or productive people, help others, etc.
------
daemin
So perhaps the answer is for Kickstarter to not allow any sort of companies to
be funded, but rather only keep it for once off art projects...
~~~
phantomb
I understand that's what the Kickstarter team is most interested in. But what
would that accomplish besides causing all the companies to simply move to the
next crowd-sourcing site?
If Kickstarter doesn't want a cut of that money I'm sure Indiegogo will be
happy to take it
~~~
daemin
It's a tough question Kickstarter might ask itself, but then again since it
gets a cut of all the funding that comes through it, why would it want to stop
(over)funding these company projects. I think at this stage it won't change
unless it is forced to from a third party.
------
blazespin
Actually, in all of this anger, there is great news. This is validation of the
crowdfunding format in a big, big way. I hope this is just the start (well,
the start of getting actual ROI for your crowdfunding investment; enough of
this donation silliness).
------
shurcooL
This is all that I'm reminded of:
[http://mashable.com/2010/04/02/facebook-acquires-
divvyshot/](http://mashable.com/2010/04/02/facebook-acquires-divvyshot/)
------
dalek2point3
My complaint is mainly with the headline. If the word "kickstarter" was
anywhere in the title I would not have clicked. D'oh.
~~~
maxden
and the headline has changed again...
it was "18 months ago Oculus did one of history's smartest rounds $2.4M for a
0.0% stake" (wsj.com)
------
fiatjaf
If the government let it, every small company or utopian project would sell
stocks, not empty dreams, to believers in the internet.
------
refurb
Did Oculus raise more money after the Kickstarter or did these guys own 100%
of the equity?
~~~
djloche
They did a $75M round recently and have been hiring like crazy.
~~~
jes5199
at this point, we should read "hiring like crazy" as "trying to get acquired".
Having a lot of employees boosts your valuation but does little for your
ability to deliver a product
| {
"pile_set_name": "HackerNews"
} |
Vesper app - pstinnett
http://vesperapp.co/
======
pooriaazimi
( _please note that my post is meant to give others another use-case for the
Vesper app, and isn't an ad for a competitor product!_ )
I don't care much for a diary or travel book, but I do use DayOne[1] (for OS X
and iOS) as a journal of interesting stuff I find (mostly on HN). Years ago I
used delicious to save/tag webpages, after its stagnation I used other
services (I don't even remember their names), then started tagging stuff in
Evernote, but none of them were as natural and easy-to-use as DayOne. It lacks
a lot of features, but the Markdown format and its general ease of use makes
up for that - I just press Control-Shift-D and start typing (or pasting a URL
+ it's HN discussion link for future reference). Occasionally I write down an
interesting quote or image, or a passage I've read in an article. Also, the
way it stores entries is using plist files, so I'm not afraid of platform
lock-in. If I find something better (like this Vesper app here), I'll just
write a converter and translate those .plist files to the new format.
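(A converter along those lines is only a few lines of Python with plistlib.
A sketch; the entry keys and folder layout here are from memory and may be
off:

    import plistlib, pathlib

    # each Day One entry lives in its own .doentry plist file
    for path in pathlib.Path("Journal.dayone/entries").glob("*.doentry"):
        with open(path, "rb") as f:
            entry = plistlib.load(f)
        text = entry.get("Entry Text", "")   # key names assumed
        created = entry.get("Creation Date")
        # ...emit text/created in whatever format the new app expects
)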
I haven't tried Vesper yet (probably will wait until there's an OS X client,
which is where I use DayOne 99% of the time), but it looks very nice and
promising.
[1]: <http://dayoneapp.com>
~~~
kmfrk
Looks like they have punted on encryption for over a year, which seems like a
very weird and careless thing to do:
1\. <http://dayoneapp.com/support/passwords/>
2\. [http://iphone.appstorm.net/reviews/lifestyle/day-one-a-
gorge...](http://iphone.appstorm.net/reviews/lifestyle/day-one-a-gorgeous-
synchronized-journalling-app/#comment-894327741)
3\.
[http://web.archive.org/web/20120902234719/http://dayoneapp.c...](http://web.archive.org/web/20120902234719/http://dayoneapp.com/support/passwords/)
The app looks really interesting, though, and I would probably have used it
otherwise. Maybe now's a great time to address encryption with the focus on
the Vesper app.
~~~
ozarius
If encryption is your main requirement, you could try AnyLocker -
<http://anylockerapp.com>
Its main selling point is that it encrypts all your data (notes, pictures)
with SHA-256 before storing it on your device. Give it a try (Disclosure - i
am the developer in the team that published this app)
Reviews/feedback always welcome...
~~~
kmfrk
Nah, I already have Ben the Bodyguard[1] for that.
I don't really use it, but I use a to-do app constantly, and miss a tool to
continue writing on my projects' Markdown files.
[1]: [http://benthebodyguard.com/](http://benthebodyguard.com/)
------
aarondf
Seems strangely devoid of information... Am I supposed to already know what
vesper is?
~~~
Samuel_Michon
A ‘Vesper’ is a cocktail drink that was introduced in Ian Fleming’s novel
‘Casino Royale’. John Gruber is a fan of the James Bond stories and of high
alcoholic beverages. I don’t see how the name has anything to do with note-
taking, though.
<http://en.wikipedia.org/wiki/Vesper_(cocktail)>
~~~
fbpcm
The app is sold by Q Branch LLC
~~~
Samuel_Michon
Yes, which is the company Gruber started with his friends. The kid in all the
app’s screenshots is Jonas Gruber, John’s son.
Like Vesper, the name ‘Q Branch’ also refers to the James Bond franchise (Q is
the mastermind who provides James Bond with all of his clever gadgets).
[http://www.macworld.com/article/2040883/meet-vesper-a-
notes-...](http://www.macworld.com/article/2040883/meet-vesper-a-notes-app-
with-an-all-star-development-team.html#tk.twt_jsnell)
------
AliAdams
I'd love to know what the background/reasoning was for another note-taking
app. Sounds like a great team but like some other people have said - it
doesn't look like a brilliant space to start out in.
------
jasonpbecker
The MacStories Review [1] has a gif that shows how the app transitions from
screen to screen. To me, that's the most interesting part of the app because
it introduces a new UI/UX that feels faster and seems unique. Who knows if
this will catch on, but it could be something not unlike pull-to-refresh, the
hamburger button, or the basement metaphor on iOS if people really like it.
So maybe it's just another note app that won't work for those of us already
using a different solution. But the UI/UX design is pretty cool and
potentially noteworthy.
[1]: [http://www.macstories.net/reviews/vesper-review-collect-
your...](http://www.macstories.net/reviews/vesper-review-collect-your-
thoughts/)
~~~
cloudwalking
Gif direct link:
[http://90a2cc4b7deb50ba0492-5793e9161196cd023f2e1f1322f2910e...](http://90a2cc4b7deb50ba0492-5793e9161196cd023f2e1f1322f2910e.r22.cf1.rackcdn.com/2013-06-06%2015-24-46-vesper-
vesper.gif)
------
coderguy123
I just don't get this app. There are thousand alternatives cheaper or free.
------
meerita
What about a web version of this? If you want to write more than 100 words,
the iPhone becomes really annoying. I hope they at least release this option
some day.
------
fromwithin
Android version in the works?
~~~
perishabledave
Given the team that made it (i.e. John Gruber, Brent Simmons, Dave Wiskus), I
probably wouldn't expect one any time soon. Not merely referring to their
personal preferences, with only one developer on their team I can't imagine
they have much resources to go cross-platform.
~~~
fakeer
Any time soon?
Even if the other two decide to do it, Gruber might threaten to jump off a
bridge!
------
capkutay
I thought about making an app like this for a long time. Kind of like indexing
your thoughts. However, I feel like Evernote may have taken the space in a way
that wouldn't allow any competing apps to scale.
------
oakaz
It looks good, but damn, it's so difficult to install. It doesn't appear in the App Store!
~~~
perishabledave
I think the App Store updates at 1 PM PST. You'll probably see it then.
------
logical42
I really like it!
Feature request: allow for a lock code on this app, please.
Restoring accidentally deleted files on Linux - potus_kushner
https://sabotage-linux.github.io/blog/6/
======
rnhmjoj
If you are using ext4, there is extundelete, which simplifies the process; it
requires remounting the filesystem read-only, though.
mount -o remount,ro /dev/sda1
extundelete --restore-all --after $(date -d "-1 hours" +%s) /dev/sda1
find /RECOVERED_FILES -name accidentally_deleted_file
mount -o remount,rw /dev/sda1
------
theamk
Another fast way to search for text in binary files is to use "grep -abo"
(treat all files as text / print matches only / print file offsets).
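For instance, a minimal sketch (the device path is just an example; you'd
normally remount the filesystem read-only first, as above):

    # -a: treat binary as text, -b: print byte offsets, -o: print only matches
    grep -abo 'some unique phrase from the lost file' /dev/sda1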
------
dusted
Has anyone checked if that is actually faster than just using grep on the
block device? I've used grep the few times I've had a need for it, and just
copied the source off the terminal when it found something. (Tell grep to
give you some lines before and after a match, and tell it to treat the
block device as plain text.) You usually get some trash before/after the
matches - those could be terminal control characters - but those can be
trimmed with, drumroll: tr.
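Something like this sketch, with a placeholder pattern and device:

    # -a: treat the device as text, -B/-A: lines of context before/after a match
    grep -a -B2 -A2 'some unique phrase' /dev/sda1 > dump.txt
    # trim the non-printable trash, keeping newlines and tabs
    tr -cd '[:print:]\n\t' < dump.txt > cleaned.txt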
------
rrauenza
Last time I had this happen (thankfully a long time ago) I just used Perl
against the block device and used a regex.
Forensic tools also work well for this.
------
aurox
Thanks, this will come in handy. Needed it a bunch in the past.
------
jvanderbot
shouldn't that be (addr2-addr1+1)? Or does dd copy the first byte _plus_ the
count?
Ask HN: Interested in a Lisp Google Calendar API Library? - gibsonf1
Is anyone else interested in a nice, easy to use and reliable Lisp library to talk to the Google Calendar API in an elegant way? I sure am, and if you are too and like to code in lisp, please email me to join the team adding this little gem to a growing list of very helpful Lisp libraries. (We'll of course use one of the existing xml lisp libraries)
======
gibsonf1
I've just been on #lisp, and pkhuong offered this code to the effort:
<http://www.discontinuity.info/~pkhuong/cl-gcal.lisp>
Coronavirus Conspiracy Theories Are a Public Health Hazard - vo2maxer
https://www.wired.com/story/coronavirus-covid-19-misinformation-campaigns/
======
TravelN0mad
Wait for it: they're going to use this as a pretext to extend their already
rampant censorship and account-lockdown program.
Governments will close down bitcoin and crypto if they get too big, Jamie Dimon - SirLJ
https://www.cnbc.com/2017/09/22/bitcoin-jpmorgans-jamie-dimon-lays-into-bitcoin-again.html
======
garamirez
I really like JD's comments on this Bitcoin madness. This is going to end
sooner rather than later, and people are going to get hurt (in their pockets).
Greed is good... but at a reasonable level... all cryptocurrencies are being
fueled by greed... nothing else.
------
byoung2
_Dimon warned that governments will eventually crack down on cryptocurrencies
and will attempt to control it by threatening anyone who buys or sells bitcoin
with imprisonment, which would force digital currencies into becoming a black
market._
Couldn't that make it more valuable? Take illegal drugs for example... they are
very expensive because they are illegal to produce, distribute, buy, and sell.
Haiku R1/beta1 has been released - hakfoo
https://www.haiku-os.org/news/2018_09_28_haiku_r1_beta1/
======
artsyca
Posting this from HaikuOS! Greetings earthlings! Man I feel like I've opened a
time capsule.
~~~
fit2rule
Tell us more about it -- what have you discovered? What do you like so far?
Can you use it productively for things?
~~~
artsyxxx
I'm still discovering what's going on with this OS. Apparently we can install
VMware Tools. It's been over 15 years since I've touched anything to do with
BeOS, but in the words of Celine Dion... it's all coming back to me now.
------
mntmn
Incredible progress, very cool! I used BeOS back in ‘99 and loved it.
Applications were straightforward to build with C++ and the BMessage system.
It had a great little community, even in Germany (“BeGeistert”!). Does Haiku
run on ARM nowadays?
~~~
deaddodo
Yes:
[https://download.haiku-os.org/nightly-images/arm/](https://download.haiku-
os.org/nightly-images/arm/)
It's even buggier than the x86 version though, and you don't have any software
beyond what you compile for it yourself.
~~~
tossaway44
I thought the ARM version didn’t have a GUI yet...
~~~
deaddodo
The Application Server is in progress, but it builds. The problem is that on
x86 you can fall back on the BIOS with VESA to display video, while ARM lacks
that functionality. If you can find/build a video driver, it should Just
Work(tm).
You can see the current video drivers here:
[https://github.com/haiku/haiku/blob/master/build/jam/images/...](https://github.com/haiku/haiku/blob/master/build/jam/images/definitions/regular#L205)
Which are all x86/amd64-based. The RPi would actually be relatively simple to
throw a framebuffer video driver together for, though:
* [https://www.cl.cam.ac.uk/projects/raspberrypi/tutorials/os/s...](https://www.cl.cam.ac.uk/projects/raspberrypi/tutorials/os/screen01.html)
* [https://github.com/raspberrypi/firmware/wiki](https://github.com/raspberrypi/firmware/wiki)
* [https://github.com/dwelch67/raspberrypi/tree/master/video01](https://github.com/dwelch67/raspberrypi/tree/master/video01)
~~~
waddlesplash
Err ... no? The problem with the ARM port is the lack of device drivers, and
the fact that there is no userland/syscalls ABI set up yet at all. So there's
still a lot to be done here.
------
smallstepforman
And the release notes:
[https://www.haiku-os.org/get-haiku/release-notes/](https://www.haiku-
os.org/get-haiku/release-notes/)
~~~
AHTERIX5000
Loving how they list a Pentium II as the minimum requirement. I still have an
old netbook with an Intel Atom that I use when travelling to certain parts of
the world, and while it is clocked at 1.6 GHz and has 2 GB of RAM, many modern
Linux desktops are extremely sluggish on it.
~~~
morganvachon
Haiku would definitely bring it to life. I had a similar Atom based netbook
and it was a perfect Haiku machine. I also have a PIII based laptop from 2000
with 512MB RAM that runs Haiku (and BeOS 5.0 Pro) extremely well.
~~~
UncleSlacky
Good news. I'm just about to try it on my Asus eeePC 701 4G; I've upgraded the
RAM to 1 GB and have an 8 GB SD card for additional storage. Fingers crossed!
~~~
david-given
I have an eee 701 --- there are graphics driver problems, but it works fine in
safe mode. (I think there's something weird about the eee's smaller-than-
normal screen.)
Incidentally, I recall the early eees (like mine) having a faulty SD card slot
and being unreliable when writing to SD cards. Real MMC cards work fine. But
that was years ago, so maybe there's a workaround now.
~~~
UncleSlacky
Thanks for the graphic driver tip. I've never had an issue with SD cards, but
the 4G is (I think) the last iteration of the original 700 series, so
presumably it was fixed by then.
------
swerner
This is a very faithful reimplementation of BeOS, which means it shares one
unfortunate limitation with it: The lack of drivers.
Already back in '99, not leveraging 3D hardware was a problem for adoption.
These days even more so, as not only 3D software but also 2D, audio and
scientific software leverage GPUs.
~~~
bebop
There has been discussion [0] on porting FreeBSD's linuxkpi framework to get
Linux drivers working on Haiku. So far I do not think anyone is actively
working on it, but it would be a very similar solution as was used to get
FreeBSD's networking drivers working on Haiku.
[0] - [https://discuss.haiku-os.org/t/plans-
for-3d-acceleration/727...](https://discuss.haiku-os.org/t/plans-
for-3d-acceleration/7272/8)
~~~
swerner
That would be wonderful. If I could get OpenGL support for current generation
NVIDIA or AMD hardware, ideally with OpenCL or Vulkan too (CUDA is probably
too much to ask for), I would even be able to use Haiku for my daily work.
------
lcnmrn
C’mon Mozilla, make a smart move and bring Firefox to Haiku OS. Just kill that
Chrome OS thing.
------
tcbawo
From what I've read in the release notes, Haiku uses GCC version 2 for ABI
compatibility reasons. Can anyone elaborate on whether this is the case, or
whether modern GCC can be/has been modified for the OS to maintain binary
compatibility? Also, what is lost in this case?
~~~
morganvachon
There are two versions of Haiku: the hybrid 32-bit version uses the older GCC
for compatibility with software written for BeOS, plus GCC 7.3.0 for modern
app support. The other is a 64-bit release with only the modern GCC and no
backwards compatibility with BeOS apps.
[https://download.haiku-os.org/](https://download.haiku-os.org/)
~~~
tcbawo
What aspect of newer versions of GCC breaks the ABI? Are these things that
can't be controlled?
~~~
bebop
First, to add to the parent comment, you can install later versions of GCC on
the x86 gcc2 builds as well. There is also a port of LLVM.
Second, not directly related to what changed with the ABI, but I think it
would be more hassle than it is worth to try and make newer versions of GCC
emit compatible binaries. There is work to run Haiku GCC 2.95 libraries
alongside binaries built with a newer version. The runtime loader would be
responsible for picking the correct .so files. This would also work on the x64
builds, similar to how Linux (and others) can load x86 programs on x64
OS's. This feature is close to working, but some last-minute issues were
found, so it was left out of the beta.
~~~
stuaxo
Ah, so BeOS programs could work under 64-bit Haiku?
~~~
bebop
Yes, that is the plan if I understand correctly.
------
grafelic
My Thinkpad 240 is ready; now waiting for a USB-to-IDE adapter to install
Haiku on its disk.
~~~
netrap
I tried on mine but it wouldn't boot last time.
------
cgb223
What are the advantages to using HaikuOS vs other OSes?
~~~
smallstepforman
The same dev team does the whole stack, from kernel, kits, drivers, and
servers to user space. In theory, it is all integrated and has a direct path
from server to kernel, resulting in a more efficient system.
The BeOS core API was designed for multi-core systems from day one, so the
design is more modern than that of older systems.
The core API is C++, from kernel to user space.
Lots of things to like, it just needs more polish and devs to flourish.
~~~
tialaramex
> The same dev team does the whole stack, from kernel, kits, drivers, and
> servers to user space. In theory, it is all integrated and has a direct path
> from server to kernel, resulting in a more efficient system.
Except, while that's often what the blurb says, the implementation is
typically paste in whatever somebody else wrote and call it a day.
For example years back Haiku announced it was going to have WiFi. Its own WiFi
stack, far more integrated and reliable than things you were using on other
systems. Very exciting.
And then as a "stop gap" it pasted in FreeBSD's PCI 802.11 driver subsystem,
and the WPA Supplicant you've seen on all your Free Unix systems. And that's
where they stopped. Last time I checked they had a version of WPA Supplicant
vulnerable to KRACK but I'm sure they'd updated for the Beta.
How does their "integrated" solution handle your WiFi credentials? They use
the powerful Haiku "KeyStore" password protection system. Here's how that
works: You write the plaintext password into a file. Don't worry though, in
KeyStore they wrote "TODO: Actually encrypt". Which I think you'll agree is
almost as good as using encryption?
> BeOS core API was designed for multi core systems from day one, so the
> design is more modern than older systems.
More modern is relative. "Older systems" in this context is stuff from the
1980s, because BeOS is roughly the same age as Linux, started in 1991.
However in Haiku you won't find anything like IO Completion ports, or futexes.
You won't find any privilege separation because everything runs as "root".
Everybody else moved on and Haiku is stuck being really quite good by 1990s
standards.
~~~
eloisant
Well, BeOS is the same age as Linux but Linux was an implementation of older
concepts (Unix) while BeOS was designed from scratch.
~~~
coldtea
That was 1991 Linux. Modern Linux has moved much further than the original
implementation, introducing all kinds of non-old-time-UNIX concepts (starting
from Plan9 ones).
~~~
pjmlp
It still needs to catch up with mainframes' security, though.
~~~
yjftsjthsd-h
In _general_ I don't know enough to disagree, but keep in mind the context
here is comparing to Haiku, which is _not_ great at security, particularly of
the multi-user sort to which you refer.
~~~
pjmlp
There is a Multics security research paper that describes out-of-bounds as
practically non-existent thanks to PL/I features.
According to the Linux Kernel Security Summit 2018, 68% of kernel exploits are
caused by out-of-bounds errors.
That is just one example, there is plenty of material available to anyone that
wants to dive into the subject.
So just because GNU/Linux fares better in security than Haiku doesn't
mean there is nothing left to improve.
------
sitkack
I just had an idea to use KVM and PCIe passthrough to piggyback Windows
driver support for Vulkan into a PCIe device running in Haiku, so Haiku could
use the latest graphics cards without having to write its own video drivers.
Bonus points for implementing mmap across to guest OSes. There should be a
generic hardware description for a Vulkan based paravirt PCIe graphics card.
Then the backend of the pvirt GPU could either pass it through to the driver,
send it over the network or re-translate it on the fly to AVX, WebGL, etc.
Because the first question that popped into my head was, "How do they enable
modern GPU support?"
~~~
waddlesplash
We already have a roadmap for that, see here: [https://discuss.haiku-
os.org/t/plans-for-3d-acceleration/727...](https://discuss.haiku-
os.org/t/plans-for-3d-acceleration/7272/8)
We don't want to be confined to VMs forever; in fact we already have a
sizeable number of users running on bare metal.
------
artsyxxx
Can this run on VMware?
~~~
kome
I am running it on VMware and it works like a charm. I am writing from
WebPositive, the integrated browser, and it's pretty awesome.
------
gigatexal
I still think Apple should have bought NeXT and BeOS. Can't wait to try this
out though. Loving the idea of virtio drivers
~~~
chungy
They tried buying BeOS, but the BeOS people absolutely refused to sell out to
a company like Apple. NeXT was their second choice, and they succeeded in
buying it (NeXTSTEP became Mac OS X).
~~~
chiph
IIRC, Jean-Louis Gassée named a price that Apple was unwilling to pay. I don't
know his motivation for doing that - was it all about the money? Or was it to
stick it to Apple and name a stratospheric price as a way of sending a
message?
~~~
tialaramex
Jean-Louis Gassée is an ex-Apple executive. He's why Jobs left Apple. For him
the point of Be Inc. was to show Apple they'd been wrong to subsequently get
rid of him in 1990.
When Apple were looking at BeOS / NeXT what they were really trying to decide
was whether to have Jobs or Gassée back. They picked Jobs.
After Be, Inc. sank, JLG arranged to run Palm's software spin-off. This
produced the version of PalmOS that's remarkable because, instead of being
used in more palmtop devices than previous PalmOS versions, it was used in
none at all; even the Palm hardware company never used it.
IMO Gassée would be nobody you've heard of if not for the fact of his being in
a position to force Jobs out of Apple. Everything else follows from that.
------
jernfrost
Loved BeOS when it came out, but isn't the whole concept outdated now? C++
all over the place seems like a bad idea for enabling the use of other
languages, as C++ is terrible at interoping with anything. Multithreading all
over the place, BeOS style, does not seem like the way concurrency is done
today with higher-order APIs.
~~~
erikpukinskis
Sure it’s probably outdated but that doesn’t mean it shouldn’t exist. A
Porsche 911 is outdated but it has its place in our lives.
------
dmitripopov
17 years in development. Just WOW!
------
Fnoord
Any way to get 802.1x working?
~~~
waddlesplash
Yes, it's supported out of the box using FreeBSD's WiFi drivers, so it should
support most chipsets FreeBSD does.
~~~
Fnoord
Does PEAP work as well?
Why programmers can’t make any money: dimensionality and the Eternal Haskell Tax - gu
http://michaelochurch.wordpress.com/2014/06/06/why-programmers-cant-make-any-money-dimensionality-and-the-eternal-haskell-tax/
======
PaulHoule
Oh yeah, but it even affects Java programmers.
If you want a job working with Lucene 3.6, it is easy. There is always some
recruiter who knows these guys who have an opening because the last programmer
burned out and they need you in a hurry because the product was supposed to
hit the market 2 weeks ago.
It doesn't take great interviewing skills to win this job because everybody
involved wants to stop putting out fires and if you can avoid sexually
harassing somebody or making a Jewish joke in front of somebody named Cohen
you can get the job.
If you are so foolish as to accept the mission, you'll get issued a bottom-of-the-
line "business" laptop from Dell which is a hand-me-down from a salesman who
couldn't sell anything. You'll find the servers are bursting at the seams even
though the system isn't in production yet. And then you'll find that your team
has smashed Lucene 3.6 in ways that devastate performance and that are very
hard to maintain. You can't get a straight answer from the lead developer as
to how to build the system, never mind anything else.
Now, Lucene 4 uses half the memory of Lucene 3 because it avoids the UTF-16
tax. Many operations are sped up, some by hundreds of times. The code base is
easier to work with and wouldn't have required the violence done to it that
had been done to Lucene 3.
If search performance matters, they'll get smoked by a competitor who uses
Lucene 4, so I feel it is malpractice to work on Lucene 3 projects.
Plano, Texas ranked as best city for building personal wealth - procrastitron
http://money.cnn.com/2008/06/30/real_estate/personal_wealth/index.htm?section=money_topstories
======
aggieben
Interesting title: _small-town life may mean big money_.
Plano is anything but a small town, at least from an American perspective. It
is the quintessential suburb of a top-ten city, and thoroughly enmeshed in the
metropolitan area. Only someone from one of the top five cities could think of
Plano as a "small town". Consider the 5A high-schools: my brother graduated
from Plano Senior High School in 1984 in a class of 1200. I hear they are even
bigger now. That school is not the only one at that scale, and it would be
considered 6A in some states.
Having said that, I think Plano is one of the 3 cities in the Dallas Metro
area that have a decent shot at becoming a technology hotbed that resembles
the well-known ones (The Valley, Boston, Austin). Plano actually resembles
Austin in some ways.
Rockwall and Richardson are the other two Dallas suburbs that I think make for
decent technology areas. We'll see how that shapes up.
Add '127.0.0.1 xn--9q8h' to /etc/hosts gives you "localghost" - DyslexicAtheist
https://twitter.com/rfreebern/status/1214560971185778693
======
jackewiehose
It took me a while to get the joke because I didn't recognize the white ghost
on a white background. I take this as another example of why these non-ASCII
URLs are a bad idea for security.
~~~
icebraining
It's not white, it's the blueish grey #E1E8ED. You need a better monitor :)
~~~
paulddraper
Fails all contrast measures abysmally: [https://contrast-ratio.com/#%23E1E8ED-
on-white](https://contrast-ratio.com/#%23E1E8ED-on-white)
And this is despite that fact that the actual "font color" is #14171A with
excellent contrast: [https://contrast-ratio.com/#%2314171A-on-
white](https://contrast-ratio.com/#%2314171A-on-white)
But these bizarre characters don't respond the text color and thus utterly
unpredictable in their legibility.
~~~
icebraining
The use of the color is quite different from a regular character, so measuring
by the same yardstick makes no sense. This is a blob of color, not just some
thin lines.
> these bizarre characters
At least on Twitter.com, they're not characters, they're SVG images.
~~~
paulddraper
PNG, but yes.
~~~
icebraining
Maybe it differs based on the browser? Shows up as
[https://abs-0.twimg.com/emoji/v2/svg/1f47b.svg](https://abs-0.twimg.com/emoji/v2/svg/1f47b.svg)
to me.
~~~
paulddraper
[https://abs.twimg.com/emoji/v2/72x72/1f47b.png](https://abs.twimg.com/emoji/v2/72x72/1f47b.png)
¯\\_(ツ)_/¯
------
shawkinaw
This is using Punycode encoding, see
[https://en.m.wikipedia.org/wiki/Emoji_domain](https://en.m.wikipedia.org/wiki/Emoji_domain).
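A minimal sketch of the round trip, using Node's built-in (deprecated, but
still shipped) punycode module; the userland "punycode" package on npm exposes
the same API:

    // toUnicode/toASCII handle the "xn--" ACE prefix for you
    const punycode = require('punycode');
    console.log(punycode.toUnicode('xn--9q8h')); // 👻
    console.log(punycode.toASCII('👻'));         // 'xn--9q8h'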
~~~
jackewiehose
Why is this even a thing?
~~~
tialaramex
There are a bunch of different human writing systems. All of them are weird
because they were invented by humans, most of them are _very_ weird indeed
because they were invented by humans a long time ago and then gradually
mutated.
The Latin system is the one you're using here. It's very popular. Most humans
in the integrated tribes are somewhat familiar with it‡. It has twenty six
"letters" and then twenty six more "capital letters" which look different but
mean almost the same thing for some reason, and then a bunch more symbols that
aren't quite letters although some (apostrophe, ampersand) have a better claim
than others. But other popular systems include Han, which has a shitload of
logograms, and Cyrillic and Greek which have different letters than Latin and
different rules about how letters work.
Anyway, the people who invented DNS only or primarily used the Latin system
and they weren't much into capital letters. So, their system doesn't treat
capital letters as different and only has one set of twenty six Latin letters,
ten digits, a dash and an underscore for some reason.
But, lots of people who do NOT have Latin as the primary or only writing
system found this annoying to work with. They wanted to use their writing
system with DNS especially once the Web came along and popularized the
Internet.
Punycode is a way to use some reserved nonsense-looking Latin text in any DNS
label to mean that actually this DNS name should be displayed as some Unicode
text. Unicode encodes all popular human writing systems (and lots of unpopular
ones) fairly well, so this sorts the problem out. Specifically Punycode
reserves Latin names that start xn-- for this purpose. By only caring about
display this avoids changing anything in the technical underpinnings. Only
user-facing code needed to change, every other layer is unaltered.
The rules about IDNs say that a registry (such as .com) should have rules to
ensure the names in that registry are unambigous and meaningful. But in
practice the purpose of the .com registry in particular is to make as much
money as possible regardless of the consequences. So you can register any name
you please and they won't stop you even if it's deliberately confusing.
‡ None of the extant unintegrated tribes have writing. This probably doesn't
mean anything important.
~~~
downerending
Perhaps ironically, URLs were never meant for human consumption in the first
place. You were meant to "travel" to various sites via search indices, etc.
(Think Google.)
Viewed in that light, restricting DNS names to ASCII as a way to reduce bugs,
security issues, etc., makes a lot of sense.
~~~
ncmncm
Search engines happened remarkably late. Nobody understood how easy search
engines would be.
People thought you would follow links from directory pages linked from your
"home page", hence the house symbol still seen in browsers. Yahoo! is a
leftover of an attempt at a directory.
~~~
downerending
Indeed--this is what was meant by "indices".
It's kind of like XML. It was never meant to be seen by human eyes, except in
a few debugging scenarios. Unfortunately, that intention was ignored, and now
we have a usability disaster. (At least in the case of XML, it can just die.)
~~~
lokedhs
I agree with what you said about XML. But in its stead we have a lot of JSON,
which in several respects is even worse.
The worst part of it is that it doesn't have a stable definition for numbers,
making it impossible to guarantee you're getting the same value back if you
encode and then decode a number. Reliably preserving the data you serialise
using it should be a primary feature for an encoding format. JSON can't even
do that.
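A quick illustration in JavaScript, where JSON numbers become IEEE-754
doubles:

    // 2^53 + 1 has no exact double representation, so it silently rounds
    JSON.parse('9007199254740993')  // -> 9007199254740992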
~~~
gowld
Why do you need a stable serialization for an unstable data type? Use a string
if you want stability.
~~~
lokedhs
The point is that a 64-bit integer is stable in the language I'm using (which
is most languages).
My opinion is that a serialisation format that explicitly makes something as
fundamental as the representation of numbers unspecified is not useful as a
data representation format.
------
IHLayman
I hate to be a spoilsport, but this is a good reminder that
[https://en.wikipedia.org/wiki/IDN_homograph_attack](https://en.wikipedia.org/wiki/IDN_homograph_attack)
is still possible in some domains and with some browsers and CLI tools,
although some of the easier tricks to detect have been mitigated.
~~~
DyslexicAtheist
if you run pihole or a local dnsmasq/unbound it should be possible to mitigate
it by sinkholing any unicode domains, e.g. with dnsmasq (requires a patch
[https://github.com/spacedingo/dnsmasq-
regexp_2.76](https://github.com/spacedingo/dnsmasq-regexp_2.76)) you can do
this:
address=/:xn--*:/0.0.0.0
does anyone know if something like this is possible with unbound?
~~~
rahuldottech
But... What about legit unicode domains? I own a couple that I use for
personal projects or file sharing.
~~~
kylek
I've never seen a legit unicode domain personally, but is this not compatible
with whitelisting specific domains? (in pihole, anyways...)
~~~
LadyCailin
I have, but I live in Norway, where we have æøå in the standard alphabet. I
suspect it’s more common still in Asian countries, because at least in
Norwegian, there are standard ascii replacements for all the extra letters, å
= aa, ø = oe, æ = ae
~~~
Nullabillity
Here in Sweden I have never encountered a single legit IDN domain.
~~~
Symbiote
I know [http://www.xn--sknetrafiken-ucb.se](http://www.skånetrafiken.se), but
like many it just redirects to an ASCII version. (Does it look weird seeing
"Skane" when you know it ought to be "Skaane"?)
Similarly for a power company, [http://xn--rsted-uua.dk](http://ørsted.dk),
just a redirect, but they do use it on adverts and my electricity bill.
Some that don't redirect: [http://xn--mgk--jra.dk/](http://mgk-ø.dk/)
[https://www.xn---strm-uuae.dk/](https://www.ø-strøm.dk/) [https://xn--
magnusbrth-85a.se/](https://magnusbråth.se/)
(HN has converted the displayed URLs to Punycode, presumably as a quick
security measure without reference to the reasonable characters for each TLD.)
------
billpg
I wanted to make a new website using emojis instead of "www" as a joke about
the number of syllables. ("Angry Face Angry Face Angry Face" takes the same
amount of time to say "www".)
Browsers kept insisting on showing this as xn--b38haa.crankybill.com, so I
went with "grr" instead.
------
banana_giraffe
Interesting, both Chrome and Firefox seem to show the Punycode encoding after
I enter the emoji in the URL for me.
Do browsers always show the Punycode encoding, or do they show the encoded
glyphs only in some scenarios? I can't find examples of Punycode in the wild
used by normal websites.
~~~
londons_explore
I believe the config of which glyphs to show depends on the TLD. There is a
hardcoded list of which character ranges are acceptable per TLD, and if any
characters are outside those ranges, the xn-- form is shown instead.
------
AndyMcConachie
Not all unicode characters are valid IDN labels. For example, emojis are not
valid in IDN labels.
~~~
daxterspeed
Valid or not, several emoji domains have existed since 2001:
[https://en.wikipedia.org/wiki/Emoji_domain](https://en.wikipedia.org/wiki/Emoji_domain)
~~~
WorldMaker
As with so many things domain name related, what is or is not valid varies by
and is determined by registrar. The biggest registrars (.com, .net, .org as
three examples) generally have a lot of restrictions on IDNs, whereas many
countries can afford to allow just about the full gamut of Unicode if they
wish.
------
netsharc
Hmm, I wonder if that's going to be the next battlefield for URLs: Facebook
will try to register its logo as an emoji, and you'd just need to go to
[http://[f]](http://\[f\]) to open their site.
There already is an emoji for apple (the fruit, not the company). Oh the
horrors. I should start an emoji NIC!
~~~
manifestsilence
These would be less than convenient to type, but perhaps as we go more and
more towards a non-typing web where a walled-garden start page and predefined
links lead to the most popular sites with a click, these URLs will become
fashionable. I think if so, this will herald the impending death of the human-
read and typed URL in favor of start page links and search results.
~~~
kick
There are more mobile users than there are desktop users, and for them it's
just the same to type.
~~~
htfu
How is switch to emoji input -> press search box -> start typing apple ->
press apple symbol and so on “just the same” as app<CR>?
~~~
kick
You don't have to search. Just hit the apple emoji. If you use it frequently,
it'll probably be on the front. If not, it takes two seconds to hit the
category it's in and then press it.
~~~
htfu
Because apple happens to start with an a. Ok fine, specifically for apple.com
it's almost the same. But that's not really the point I was arguing.
------
nottorp
And how do i type that in the address bar?
~~~
billpg
If you use Windows, [Logo]+[.]
~~~
wongarsu
That's a neat trick. It even has kaomoji and useful symbols ╰( _°▽°_ )╯
~~~
Avamander
This sounds like the modern version of this
[http://bash.org/?835030](http://bash.org/?835030) quote.
------
markandrewj
Ghost in the shell...
------
huxflux
This made my day!
How to match regular expressions faster than Perl - neilc
http://swtch.com/~rsc/regexp/regexp1.html
======
thorax
Old, but good. I did my own benchmarks after noticing some of our users
seeing 1-2 second delays on their custom regular expressions. The approach
described here did the same complex queries in milliseconds.
I highly recommend this, especially for the next revisions of Perl, Python,
etc, etc.
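For anyone who wants to see it firsthand, the article's pathological
a?^n a^n pattern is easy to reproduce in Node (a sketch; n = 25 is already
painful for a backtracking engine, while a Thompson NFA stays linear):

    const n = 25;
    const re = new RegExp('^' + 'a?'.repeat(n) + 'a'.repeat(n) + '$');
    console.time('match');
    re.test('a'.repeat(n));  // exponential blowup in a backtracking engine
    console.timeEnd('match');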
Graeme Hackland on the Evolving Role of the CIO - arabadzhiev
https://channel9.msdn.com/
======
mindcrash
That's the link to the conference livestream. The actual recorded session is
here:
[https://channel9.msdn.com/Events/TechNetVirtualConference/Te...](https://channel9.msdn.com/Events/TechNetVirtualConference/TechNetVC2016/Day-1-Graeme-
Hackland-on-the-Evolving-Role-of-the-CIO)
~~~
arabadzhiev
Thanks for putting that up :)
The myth of ‘mad’ genius - baddash
https://aeon.co/essays/is-there-any-evidence-linking-creativity-and-mood-disorders
======
commandlinefan
I've always wondered if a lot of "eccentric" people aren't just behaving the
way everybody would naturally behave if they could get away with it. I say and
do a lot of things because I have to if I want to have food to eat and a place
to sleep, but I'll never know how different my behavior _might_ have been if I
were rich enough or brilliant enough that people would just put up with
whatever I happened to feel like doing at any given moment.
~~~
majos
On a related note, I think this idea explains the appeal of smart jerk god
characters (Doctor Who, Doctor House, Rick Sanchez, Sherlock Holmes...) to a
sizeable portion of nerds (including me). "So smart they can't help but need
you" is not a healthy goal, but damn if it isn't seductive.
~~~
Ntrails
To me it's not so much "So smart they can't help but need you", as it is
"being right is more important than anything else". The world doesn't work
like that (often), but it's an attractive ideal in some ways
------
toomanybeersies
As much as I like to joke that the greatest artists were all deeply flawed
people, so I'm trying to become an alcoholic arsehole in order to become a
better author, as the article states, there are a lot of good and great
artists/creatives who are not substance abusing wife beaters.
I think there are a couple of factors at play here. The first is that you
never hear about how normal an artist was. Nobody talks about how Ansel Adams
didn't beat his wife and wasn't an alcoholic, because that's just normal. We
only talk about Hemingway's alcoholism, or Van Gogh's mental condition. So
we're conditioned to believe that artists are flawed people.
The second factor is that people want to justify why they're not a great
artist. "I'm not a great writer, but at least I don't emotionally abuse
everyone I know and I'm not a raging alcoholic".
~~~
darawk
That's a reasonable theory, but it doesn't explain why this phenomenon is
(somewhat) unique to art. The explanation you gave applies equally well to,
say, basketball players and business tycoons. And for the most part, there's
no social wisdom I'm aware of that says you have to be a drug-addicted asshole
to be one of those things.
~~~
toomanybeersies
I think that it does apply to sports, it's a common trope that athletes
(especially in contact sports) are domestic abusers. Business tycoons are
often stereotyped as sociopaths.
Plus there's doping in athletics. People love to criticise Lance Armstrong for
doping (among his other flaws), but there's no way, even with doping, that the
average person would be able to compete in pro cycling; all the steroids,
clenbuterol, EPO, and other PEDs in the world couldn't dope you up enough to
be competitive at that level.
People love to justify their mediocre existence (not that there's anything at
all wrong with mediocrity). They could be a businessman if they were more
sociopathic, they could be an athlete if they used steroids, they could be an
artist if they took drugs or were more eccentric.
~~~
watwut
Then again, there are people who dropped out of sport because they thought
they would have to dope or were pressured to. There are people I know
personally who refused jobs that required them to do something they found
unethical - despite higher pay and benefits. It is profoundly unfair to label
them as mediocre just because they have moral or ethical lines they don't
cross. Doing that is nothing but rationalization of bad acts.
Maybe the _average mediocre_ person is not just _justifying their mediocre
existence_.
Maybe you are rationalizing bad acts based on the idea that being socially
celebrated matters more than ethics and morals.
~~~
krageon
Being socially celebrated gets you all the things people teach you mean
success as you're growing up: Money, probably a beautiful wife, the admiration
of your peers. I agree with you that doesn't have to matter but I don't think
you should underestimate how much that flies in the face of what most people
have been taught matters most.
~~~
watwut
Which is why I am objecting to the knee-jerk labeling of people who resisted
the temptation as "mediocre" and the knee-jerk celebration of those who did
not resist as "the others are mediocre anyway". It is tempting even without
that.
The beautiful wife will get old like any other woman, unless you exchange her
every few years. Having a wife merely as a social-status trophy is not the
kind of relationship I would find attractive at all. I know that some girls
are raised to believe that is what they are supposed to be, but that is
another category of things I would rather avoid.
------
Semirhage
The article focuses exclusively on mood disorders, which frankly isn’t what
most people think of or mean by “mad” in any case. Nikola Tesla for example
was mad as a hatter, but it appears to have been a delusional disorder. While
depression for example can be psychotic, it tends not to be, and I think
psychosis or at least delusional beliefs are the hallmark of what is commonly
meant when the public talks about “madness.”
~~~
eigenstuff
You're on the money. Tesla, Emily Dickinson, and likely Van Gogh all had
schizotypal personality disorder. Which is not so much a personality disorder
in the sense most people think of as it is the Asperger's analog of
the schizophrenia spectrum. I've seen it suggested that Einstein may have had
it, which I find a lot easier to believe than the notion that he had Aspergers
given that his son had schizophrenia and these things tend to run in families.
(He may well have had nothing, though.) I strongly suspect that Edwin Land,
the inventor of Polaroid, was schizotypal.
There IS actually a proven connection between schizotypal traits and
creativity where there really isn't with mood disorders (except bipolar to an
extent), since you're "mad" but not so much so that you can't function well
enough to execute your ideas. It's worth googling about.
------
jpeg_hero
Article and “meta-research” distinctly not related to genius.
Genius is a distinctly one-in-a-million phenomenon; this is about people of
above-average creativity and how they relate to those of below-average
creativity. And very dubious categorization at that.
~~~
aje403
I don't know why you're getting downvoted, you're right (although the
definition of 'genius' is always up for debate). This is just one of a million
opinion articles on the same questions which happens to have a regression line
behind it.
------
TangoTrotFox
Speaking of the stereotype of 'mad genius', as opposed to the eccentric
creative which this article is about, I would hypothesize that 'mad' is simply
a mislabeling of the fact that those who are more intelligent are _generally_
going to be less guided by social norms and more by their own logic and views.
Being able to follow your own logic, without bounds, is something that
most people, for some reason, do not tend to do.
Einstein is the best example of this. Relativity, and its implications, are
intuitively insane and absurd. Yet his logic led him there and he invested an
immense amount of effort and energy trying to prove it. And it turned out he
was correct. In a parallel universe where the laws of physics are more sane,
Einstein would have been labeled as insane for even imagining such an 'absurd'
idea might be reality.
------
salty_biscuits
James Joyce said
"To say that a great genius is mad, while at the same time recognizing his
artistic merit, is no better than to say he is rheumatic or diabetic"
His daughter had mental health issues. She went to see Jung and the exchange
allegedly went
“Doctor Jung, have you noticed that my daughter seems to be submerged in the
same waters as me?” to which he answered: “Yes, but where you swim, she
drowns.”
------
swayvil
I used to be nuts. And you know what, I went with it.
I could focus on a project 24-7-365. That's how great projects get done. And
people who get great projects done are what we call "geniuses".
I'm better now. Sometimes I think of the great projects that I could do but
then I think, "no thanks, life is bigger than that".
~~~
kizer
Good that you're better. Maybe try to work lightly (2 hours a day) on a
project someday and see if you can complete it without herculean focus or
dedication.
~~~
rabidrat
It's not like that, in my experience at least. I can do amazing things in 6
weeks, but it involves eating, breathing, and sleeping the
system/problem/project. It turns out there are 1000 hours in 6 weeks
(including sleep time), and if you have the problem loaded into your brain as
its 'default mode network', then it takes over and you can get massive cross-
functional efficiencies (including dreaming up solutions in your sleep).
Working the equivalent for 2 hours/day would take years and does not generate
anywhere close to the kind of energy or output that the parent is referring
to.
------
ggm
An article about the status/role/context of insanity and society which doesn't
mention Foucault's work? Published in the year of my birth (1961), which I
like to think is a coincidence...
Foucault M. History of Madness. Khalfa J, editor, translator & Murphy J,
translator. New York: Routledge; 2006. ISBN 0-415-27701-9.
------
LifeLiverTransp
Yes, why should a schizophrenic who wildly connects everyday experiences to
form strange conspiracy theories have a natural advantage when it comes
to wildly recombining seemingly unconnected ideas, leaping over boundaries
every 100 recombinations?
Of course, if there is a birth advantage there, all the meritocratic idealism
and work won't get you or your kids there.
So it's very, very anti-egalitarian. It's okay, though, if that "benefit"
messes up someone else's life and strands him/her living in a box. That is
just how the world is supposed to work. Sick people must suffer if they do
not fit into the world tailored for average people by average people. No
sense in protesting God's wanted order. Move along.
~~~
darkmighty
Have you ever experienced or met someone with schizophrenia?
It is hell.
Not only for them but often for those who care about them. It's not about
rosy, wacky persons whom society whimsically outcasts. It's often
legitimately tortured people that almost always need serious treatment and
constant care, otherwise they risk falling into a very dark abyss.
They often wind up hobos, or sometimes in prisons/asylums, because they don't
have treatment and a very supportive family available, falling into disrepair,
not because of an evil plot by society.
If this condition were the price to pay for breakthroughs, I'd consider the
price too high. But indeed it isn't -- there are no examples of seriously ill
paranoid schizophrenics that put out meaningful work. It's a massive
hindrance, fogging your view of the world and yourself. Often cited cases like
John Nash actually stopped being able to conduct any good work once the
disease got hold of them. There are countless examples of extremely brilliant
people productive through their lives that never showed signs of delusion, and
almost no cases of delusional persons doing good work.
This disease largely embodies the obscurantism, mysticism, fear, distrust,
that scientific enlightenment was idealized to fight against. I can't wait for
its root causes to be found and it completely eradicated (along with MS,
Alzheimer's Disease and Dementia, please).
~~~
Mediterraneo10
> there are no examples of seriously ill paranoid schizophrenics that put out
> meaningful work.
I daresay Adolf Wölfli is one counterexample. Institutionalized for most of
his life, beleaguered by paranoid delusions (some of which found their
expression in his work) but it was under the circumstances of that illness
that he produced the art he is so acclaimed for.
------
fladrif
Creativity seems to stem from being outside of the "mainstream" mindset, of
going down paths not taken and trying unconventional strategies. Most of the
time these don't work out, putting an emphasis on the tried and true, but when
they do, it gets labeled as a spark of genius, of creativity. I think it takes a
certain type of person to continually beat down these paths and blaze
unconventional trails and may be a mark of a 'mad' man/woman, but it may just
take those kinds of people to not try and conform with conventional wisdom.
------
cbanek
It's interesting that there was no talk of anxiety. In talking about the
availability heuristic, which is referenced a lot in Daniel Kahneman's
_Thinking Fast and Slow_, they also talk about how anxiety tends to activate
System 2 (which is the one for deep thinking), as opposed to the intuitive and
nearly autonomous System 1.
Creativity and intelligence are considered different, but related, and
Kahneman talks about how the activation of System 2 allows people to make more
well informed (smarter) decisions.
------
askl56
I've always believed that "genius" is simply mental illness with an audience.
People such as Mozart were known for feeling nervous, anxious, etc., until they
wrote music which was cathartic, which nowadays I'm sure would be diagnosed as
some sort of mental illness.
~~~
themodelplumber
This is part of the reason why "mental illness" is a broken term for certain
applications. Unevenly-matched skill (coping, etc) vs. circumstance (group
psychology, social standards) is one alternate POV that better explains why a
"mentally-ill" person could create genius works. They have become conditioned
to exploring extremes by dint of circumstance and personal history; their
liabilities, struggles, and even periodic victories in these contexts allow
them to produce works the likes of which are rarely seen or even contemplated
in "non-genius" company. The audience knows intuitively that they are
experiencing some fundamental truth, though they cannot in a small number of
steps arrive at the sequence that allowed the truth to emerge.
~~~
humbleMouse
That last sentence is profound, nice comment +1
------
8bitsrule
_The Romantic stereotype that creativity is enhanced by a mood disorder_
That's postulated as a given. If it's not a given, then it's a red herring.
The author supplies no evidence that it's a given. As far as I got, the phrase
'mood disorder' was undefined.
_... But is there any scientific reason to believe in a connection?_
Science doesn't believe, science constructs and improves models based on
repeatable observations. That which cannot be observed cannot be modelled.
People can choose to 'believe' those models ... which is 'faith'. Which
science was invented to get away from.
So in the first two paragraphs, the author prepares us for the illucid neo-
phrenology which follows.
~~~
carlmr
>Science doesn't believe, science constructs and improves models based on
repeatable observations.
That's confusing the scientific ideal with how actual scientists operate.
Actual scientists hope that they get something right, they feel strongly about
their research like the mother of a child, they're just as clouded by emotions
as any other human being. And if you add corruption into the mix, then yes,
what we call science is not as solid as it looks, but it still provides useful
results sometimes.
But we can still talk about a scientific reason to believe. Because the reason
might be scientific, but it still might be something which we can believe or
not. Because scientific reasons are about as flawed as their creators.
The replication problems in many fields are evidence that science is only as
ideal as the people producing it.
------
majos
I don't get this article. As far as I can tell, it wants to prove (in spite of
the headline) that mental illness and creativity are not really correlated.
Then it goes on to say that measuring either mental illness or creativity is
hard on its own -- in which case the meta-analysis done isn't terribly useful?
My own unempirical take on genius is it's not so much "you need some insanity
to have groundbreaking genius-level ideas" but "some people are so into doing
thing x that they will choose to do it almost all the time, and some of these
people have talent and luck in thing x too, which is a potent combination that
looks like what we'd call genius".
~~~
taneq
Well, the general take is that average intelligence combined with well-above-
average drive (motivation, work ethic, etc.) has significantly better results
than "smart but lazy". So you'd expect someone who's smart and also fanatical
about something to get outstanding results in that field. Monomania also
generally comes across as 'mad'.
------
dbxz
I taught gifted education at a prestigious middle school for five years. Over
300 kids came through my classroom during that time.
I saw it all. Seizures. Fistfights. One girl was pretty sure there were secret
messages from the principal in the tests I gave. Lots of them started taking
drugs early too.
The culmination happened in the Fall of 2012, when one girl convinced her
friends that, if they killed themselves, they'd wake up already graduated from
college, with school behind them. Luckily, they didn't succeed. And so, in a
class of 30 people, I had 6 out and in a mental hospital.
Mad geniuses. They're real.
~~~
toomanybeersies
I used to go to a gifted education class once a week when I was around 13
years old.
Probably half the class or more had some form of condition, Autism, ADHD,
depression, etc.
Looking back at it now, I don't think it was the case that gifted people are
generally "mad". I'm a fairly normal person, and so were a lot of other people
in the class. I think that rather, it was that these kids didn't integrate
well into conventional education, but this gifted education class was very
understanding and accepting of people's "quirks", and had a much more open
learning environment where people could grow in their own way. I ended up
there because I struggled with the structured nature of normal school.
I've met plenty of people over the years who could be considered gifted (in
this case, top 5% IQ), most of them are completely normal and probably just
dealt with the overly structured nature of normal school.
------
uxhack
This article is trying to rebut David Horrobin's idea that if everyone were
creative, the world would soon become crazy. It takes a genius to see
a stick as a weapon. If everyone starts becoming creative, the whole
society will quickly degenerate. We need some order. Horrobin's argument goes
against the idea that everyone can be a creative genius. What is interesting
about his idea is how universal craziness is amongst a small proportion
of the population, which means it has been around a very long time.
~~~
Nasrudith
Reminds me of a differing but related idea. Society at large is already crazy,
but nobody notices because it has been so normalized that being rational is
looked upon as crazy.
The promoter of handwashing among doctors wound up institutionalized. And it
isn't just a problem of the past. I mean, look at interviewing etiquette and
ideals, for one. They treat people who can't or won't put on a mask
convincingly enough as untrustworthy. Not even as "unsuited for a job where it
is relevant, like sales or acting". It is frankly barking mad to only trust
those capable of faking.
Bacterial cells are small 'eyeballs', scientists discover - Marinlemaignan
http://www.sciencealert.com/bacterial-cells-are-actually-the-world-s-smallest-eyeballs-scientists-discover-by-accident
======
marshray
* Scientists knew bacteria could sense the direction of light.
* Scientists had been looking at bacteria under a microscope for 340 years.
* Just the other day, someone noticed how bacteria focus light.
If you ever think there's nothing interesting left to discover in your field,
read this again!
------
proc0
Hmm, that diagram is a little misleading. It's not like the bacteria are
processing the images in any way, right?! They are merely detecting light and
its direction so they can swim that way. Someone correct me if I'm wrong, I
guess, but I'd be extremely surprised if there was any sort of image
recognition at the cellular level.
~~~
VeilEm
Eyes don't process information, they're a sensor. In an animal its brain, a
multi-celled organ, processes the information.
~~~
gnaritas
Actually they do, and beyond that, the retina contains neurons that do the
basic processing before travelling down the optic nerve to the brain. The
retina is also basically brain tissue and is part of the central nervous
system.
~~~
pygy_
Indeed, among other things, they detect edges, movement, and preprocess color
from RGB to something conceptually closer to YUV (with color encoded along two
contrast axes: red-green and blue-yellow).
[https://en.wikipedia.org/wiki/YUV](https://en.wikipedia.org/wiki/YUV)
------
adinb
If we can figure out some way to capture and transmit what these (and possibly
other bacterial species) see, we could have trillions of little cameras
everywhere, possibly changing everything from medicine (deliver a medical
payload based off of visual information, or image tumors directly without
surgery) to biology (pseudo-xenobiology studying the underwater volcanic mats
of bacteria and algae) to even intelligence gathering (need to spy? Grow some
bacteria in an air vent or other convenient area).
I hope that someone is starting to try to capture images from those mini
eyeballs; they'd then become our mini bio-cameras.
------
spatulan
Seems similar to this
[http://www.sciencedaily.com/releases/2015/07/150701133348.ht...](http://www.sciencedaily.com/releases/2015/07/150701133348.htm)
------
aubergene
The diagram showing the DSLR with the image the right way up, as opposed to
the eyes, is misleading, as the camera also captures upside down and then
flips the image in software.
------
norea-armozel
I thought it was already established that the genes for photosensitivity led
to the formation of eyes in all animals. And that the proposed benefit in the
early Earth's oceans was to keep bacteria from frying themselves if they got
too close to the surface (due to there being no ozone layer way back then).
So, I'm not sure if this is all earth shattering knowledge at this point, but
it's still a fun read.
------
plumeria
"But it's a vital trick for the bacteria. Without sensing light and moving
toward it, the organisms wouldn't be able to photosynthesise, which has been
crucial to their survival since time immemorial."
Can this be applied somehow to treat bacterial infections?
------
swayvil
A mass of a million bacterial eyeballs probably resolves images pretty well.
Maybe better than us.
~~~
stephengillie
If you subscribe to the belief that we're basically a bacterial mat of about
37 trillion [1] cells, each extremely large and complex on its own - it's not
too large a jump to interpret the eye as being a 1 billion cell "bacterial
sub-mat" that does primitive image capture to support the overall bacterial
mat.
[1] [http://www.smithsonianmag.com/smart-news/there-
are-372-trill...](http://www.smithsonianmag.com/smart-news/there-
are-372-trillion-cells-in-your-body-4941473/)
~~~
meric
That would make us hiveminds.
~~~
justinjlynn
You think you decide to be hungry? In a very real way, we are.
~~~
bbcbasic
This freaks me out! Whoever me is? :-o
------
shultays
Billions of billions of bacteria are watching our every move. This has movie
potential.
JavaScript and Node performance coding tips to make applications faster - osopanda
http://voidcanvas.com/javascript-performant-coding-tips/
======
jcranmer
The quality of this article is readily apparent in the first recommendation.
Global variables are slow... because of having to walk scope chains? Uh, no.
Scopes are static properties (well, once the with statement was eliminated--
now you know why that doesn't exist anymore): the compiler only has to look up
the scope of the variable during compilation, which means it carries no
penalty by the time you get to a JIT. The reason why global variable lookups
can be slow is that global variables are often properties on objects with non-
trivial access costs (such as a DOM window, if you're running on a webpage), as
opposed to an inaccessible lexical environment where all references are
statically known.
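A small sketch of that distinction (illustrative only; whether hoisting helps at all depends on the engine and the year):

    // The cost, when there is one, is the property access on the global
    // object, not "walking scope chains".
    function sumAbs(arr) {
      const abs = Math.abs; // hoist the global-object property access
      let total = 0;
      for (let i = 0; i < arr.length; i++) total += abs(arr[i]);
      return total;
    }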
(It's also somewhat disingenuous that all of the coding tips are about keeping
v8 happy, as if v8 were the only JS JIT that mattered or v8 characteristics
were somehow universal in JS JITs, neither of which is the case).
Running through some of the other bits:
* I'm not sure homogeneous arrays are an optimization benefit for JITs other than v8.
* Actually, traditionally JIT compilers focus on hot loop nests as well as functions. The original TraceMonkey JIT (first JS JIT) considered loops as starting traces, for example.
* Changing shape of objects does matter for monomorphic calls. Although not all JITs are as insistent on monomorphism as v8.
* True, but isn't this obvious?
* Last I checked (admittedly quite a while ago), JITs tended to optimize based on shape, not class. So there's no difference in performance between the two cases.
* Or you could use for/of. Or you could just never use sparse arrays because no JIT likes having these holes. I thought you were giving performance tips, it seems weird not to mention one of the more universal ones here.
* Try/catch is generally free if you don't throw an exception.
I could go on, but it's not worth it at this point...
~~~
thealfreds
That wasn't always the case for try/catch. I'm having trouble finding the blog
from one of the v8 developers who went into details about it and other
optimizations. It was on HN a while back.
Here is a perf test with various v8 versions (as you noted, not the only JIT,
but going with the author's favorite).
[https://github.com/davidmarkclements/v8-perf](https://github.com/davidmarkclements/v8-perf)
~~~
jcranmer
My recollection is that the nonperformance of try/catch is something specific
to the v8 JIT. Historically, the zero-cost exception model of try/catch has
you do absolutely nothing with the data until someone throws an exception, at
which point you work out how to map the return address to the appropriate
catch handler.
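A sketch of what the zero-cost model means in practice (modern engines behave roughly like this):

    function parseOrNull(s) {
      try {
        return JSON.parse(s); // no bookkeeping on the happy path
      } catch (e) {
        // only an actual throw pays: the engine maps the throw site
        // to this handler (e.g. via a side table)
        return null;
      }
    }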
------
Klathmon
>The try/catch is a very costly operation. You should write your code in such
a way that exceptions are not uncertain.
I believe all major JavaScript engines now optimize try/catch.
And SharedArrayBuffer is now disabled by default due to Spectre. I don't see
it coming back soon either.
Also, I dislike this article because it recommends these things "always".
"Always use forEach", "always avoid delete", "always...". Not to mention it's
yet another v8 specific set of optimizations.
~~~
metalshan
The try/catch point is a general thing, irrespective of language. It's always
better to handle errors yourself.
I was not aware of the browser deprecation of Shared array buffer. But this
(or something like this) is a part of the ECMA proposal which is in stage 4.
[https://github.com/tc39/ecmascript_sharedmem](https://github.com/tc39/ecmascript_sharedmem)
I said using forEach (or built-ins) is always good practice. Not that you
have to. For example, if you have an array with no holes in it, for is
more performant than forEach.
~~~
mschuetz
I find "for of" to be much more legible than "forEach". Not much difference in
general but the "for" the beginning immediately tells me that there is a loop
here, whereas I have to read more thoroughly to identify something as a loop
in the forEach case.
// for ... of
for(let node of nodes){
console.log(node);
}
// forEach
nodes.forEach(node => {
console.log(node);
});
~~~
Klathmon
For...of is fantastic and I use it quite often (especially in async/await
codebases as it doesn't create a new function context!)
However it is slow currently in all major browser engines. For fast-path code,
you should reach for something else (like a for(;;) loop), but for most other
loops I always tend to reach for for...of; its ease of use and
compatibility with iterators make it a pleasure to use!
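To illustrate the async/await point (handle() is a hypothetical async function):

    async function processAll(items) {
      for (const item of items) {
        await handle(item); // sequential, as you'd expect
      }
      // items.forEach(async item => { await handle(item); });
      // ^ would fire everything at once: forEach ignores returned promises
    }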
~~~
mschuetz
Are you sure about that? "for of" is something that I'd expect a js compiler
to handle exactly the same way as forEach. I could see how it might have been
slower right when it came out but browser vendors are quick in optimizing
their compilers.
At least [https://jsperf.com/for-vs-foreach/293](https://jsperf.com/for-vs-
foreach/293) shows that there is no significant difference in for of vs
forEach in my version of chrome. There is an insignificant difference
acoording to which forEach is actually slower. (One forEach case in that
benchmark is only faster because it does something different.)
Edit: looking at the comparison at the bottom, it seems like forEach is
actually significantly slower than for of since Chrome 61.
~~~
olliej
Lots of work went into making the perf of for(of) fast - after the initial
“make it semantically correct” bits.
There are many reasons it is faster, but it also has the nice property of
working sensibly on any iterable object rather than forEach that only iterates
by index, and has an annoying ‘if (index in object)’ in the loop.
------
gbuk2013
This is a very mediocre article that mixes some good advice with misleading
and frankly bad things.
Justifying forEach (which is several times slower than a for loop) with sparse
arrays (which are an anti-pattern because they take you out of "fast elements"
mode in V8 at least) is laughable in a sad way. It also has no "break"
functionality , which is important if we are talking about performance (i.e.
when iterating the array to find a specific member).
The advice for using array literals to insert elements is also bad for a
similar reason that it makes it easy to create sparse arrays.
That and "i>arr.length" in the example means the for loop will run exactly 0
times! ;)
Using "filter", "map" and "reduce", by the way, is also slower than using a
for loop, even if broken out into a separate function for purposes of
chaining. This is because they call a function on each iteration and function
calls (even with optimisation) are inherently more expensive. So, use them
only when this difference in performance does not matter.
The closure / timer example is just so convoluted. Of course the closure will
keep "bar" in scope - that's what closures do! The "foo" object doesn't
somehow magically own "bar" it just has a reference to it, same as the
closure. If something still references an object then GC will not touch it.
Similarly for the event listeners advice, which is also badly worded.
And if you do need to write a timeout-based loop, I respectfully suggest the
following construct:
const interval = 1000;
var timer; // kept in an outer scope so the loop can be cancelled
(function loop () {
    // ... do stuff ...
    timer = setTimeout(loop, interval); // re-arm only after the work is done
}());
Now you can even use "rewire" to control the interval during your unit tests -
bonus!
The Arrays vs Objects thing is just shallow. In reality, the advice is "it
depends". If you need to iterate, use an Array. If you need to access by key,
use an Object (or better a Map). If you need both (and this frequently happens
in my experience) then you have to decide depending on the size of your
structure.
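The "it depends" point in a sketch:

    const users = [{ id: 7, name: 'Ann' }]; // iterating? use an Array
    users.find(u => u.id === 7);            // keyed lookup here is O(n)

    const byId = new Map(users.map(u => [u.id, u]));
    byId.get(7);                            // keyed lookup: O(1) on average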
~~~
sheetjs
> It also has no "break" functionality , which is important if we are talking
> about performance (i.e. when iterating the array to find a specific member).
The article is woefully misinformed regarding `forEach` and I agree that the
array methods in general are slower [1], but `some` will bail out on the first
true value returned. To be sure:
[1,2,3,4,5].some(function(n) { console.log(n); return n>=2; });
will not run the callback function after processing the 2
[1]
[https://news.ycombinator.com/item?id=7938173](https://news.ycombinator.com/item?id=7938173)
~~~
gbuk2013
"some" !== "forEach" ;)
Also, "some" still has the performance penalty of an extra function call per
iteration (as I know you know based on your linked comment) and we are talking
about performance advice here.
But good point - I've never actually used "some" before, so I learn something
new today - thanks :)
------
z3t4
While the points are correct, they are bad advice (except for the last point:
prefer O(1) over O(n)), because all (premature) optimizations are evil! What
works in one engine (for example Firefox) might not work in Chrome, or in the
next version of the same engine. So write stupid-simple naive code until you
actually _need to_ optimize. Then go for algorithms first: changing something
from O(n) to O(1) will have a bigger impact than, for example, creating static
binaries (which most of the time will be slower because there will be no run-
time optimizations).
~~~
bryanrasmussen
The original premature-optimization quote was specifically about spending a
lot of time optimizing complicated algorithms, in an environment in which
computation was business-critical when used at all, so as a consequence one
could count on people sticking around more than insignificant amounts of time
to get a response.
In the environments most of this would be used in, people do not stick around
for long periods of time, therefore any simple optimization that one can do in
the preliminary development of an application is not premature.
But this article sucks, as do most articles with a list of coding tips for a
particular language/platform.
------
fvdessen
I'm confused about the setTimeout example; the code as presented is creating
an infinite loop, so it is not surprising that the object used in the loop
is not garbage collected. But I would expect references inside a timer to be
collected once the timer has expired. Do they not ?
------
TN1ck
> Avoid O(n), try O(1)
Be careful with this, as dictionaries are not really O(1); in the worst case
they can actually be O(n). When your code heavily revolves around the 'fact'
that it's O(1), try making a version with a simple array iteration and compare
which is faster.
~~~
BillinghamJ
I believe a good rule of thumb is that looping over an array will generally be
faster than using a hash map until you have around ~1,000 items.
Very approximate and dependent on the case obviously, but I always find it
surprising how far you can go with just a simple array.
~~~
zaarn
From experience I would say it's closer to 100 items, though this is using Go
as the benchmark. 100 items is also what I would expect considering CPU caches.
From 100 to 1000 items the array performance is not massively worse and
shouldn't matter much but a dictionary can beat it (by a tiny bit).
If you're over 1000 definitely use a dictionary or tree.
------
vorpalhex
This article is quite incorrect. Some of these points were valid many versions
ago (try/catch did have a cost with it several V8 majors ago, but that has
since been resolved).
And, again, for the last time, Node.js is not actually single threaded and may
whatever deity help you if you choose to start additional threads without
understanding how libuv works. This isn't java, the answer isn't to throw more
threads at it.
Honestly, it's this kind of stuff that really hinders javascript. So many
people write "expert" articles that show fundamental misunderstandings about
the core of JS (including setTimeout vs setImmediate!)
------
austincheney
> Closure & timer – a deadly combo
In practice you can avoid this problem if your function wrapping a timer is
not a method and the callback of the timer is a recursive call to the very
function wrapping the timer. In that case there is no object to access to get
to the timer's function, and secondly the function is never nullified or
removed by garbage collection until the last timer has fired.
> setImmediate over setTimeout(fn,0)
Yes, agreed, but I wouldn't ever run timers in a loop to achieve parallel
asynchronicity. I would use native Node methods to achieve this. I would
however use timers to achieve sequential asynchronicity, also known as
polling, which is where setTimeout would be beneficial. Sequential operations
likely to always be slower than parallel operations, but sometimes you only
just need intermittent delays.
I agree that these recommendations in the article are really good ideas.
------
nwhatt
Any team with enough developers using Node should develop a handful who can
understand the v8 internals and help troubleshoot real-world performance
problems.
The most helpful resource I've found about v8 specific performance is the
bluebird optimization killers wiki:
[https://github.com/petkaantonov/bluebird/wiki/Optimization-k...](https://github.com/petkaantonov/bluebird/wiki/Optimization-
killers) wiki, and the v8 bailout reasons repo:
[https://github.com/vhf/v8-bailout-reasons](https://github.com/vhf/v8-bailout-
reasons)
~~~
jcranmer
The best advice I can give about performance engineering is "all performance
guidelines are bullshit." Compilers and JITs are constantly improving, which
means any specific guideline like "do X to avoid a JIT bailout" isn't
necessarily true going forward. Furthermore, the benchmarks that compiler
writers use to assess the utility of their optimizations aren't hand-optimized
unreadable kludges; they're generally typical applications.
Write readable, natural code, and compilers are more likely to eventually
optimize it, even if perhaps they do a bad job today. Only contort your code
to improve performance if you have profiles showing that you need to do so,
and even then, see if there's natural code that works first.
~~~
tejinderss
Do you think that by adding type annotations or using typed JS
(TypeScript/Flow), the optimisations could be improved? Since the types
provide more metadata about the code?
~~~
jcranmer
No (with an asterisk). The state-of-the-art for JS optimization already
computes shape information (roughly equivalent to typing) and feeds that into
optimizations.
The asterisk comes from the costs of having multiple shapes for a given
variable. V8 (to my recollection, it could very well be out of date) generally
has a performance cliff between monomorphic and polymorphic functions: if the
added strictures of typechecking are giving you monomorphic functions, you could
see a performance improvement.
generally happier to have polymorphic functions where the degree of divergence
is still small (say, two or three shapes), although having 1000 shapes is
going to be unhappy for everybody.
Note that this discussion is effectively microoptimization-level discussion:
don't rearchitect your code to enforce monomorphism at the expense of clarity
unless you have profiling evidence that the performance is necessary.
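A rough illustration of shapes and monomorphism (details vary per engine):

    function getX(p) { return p.x; }
    getX({ x: 1 });       // one shape seen so far: the site is monomorphic
    getX({ x: 1, y: 2 }); // a second shape: now polymorphic
    // hundreds of distinct shapes would make the site megamorphic,
    // which no JIT handles well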
------
brwsr
> Avoid try/catch
haha, async await needs that by design!
~~~
maga
It's not required, you can use .catch since it's just normal promises:
const result = await doStuff().catch(errorHandler);
~~~
dcherman
So what do you think your result variable contains in the event that the
promise is rejected? Is this semantically equivalent to try/catch?
~~~
maga
(async () => {
  const result = await Promise.reject(1).catch(e => e);
  return result;
})()
// => Promise {<resolved>: 1}
As you can see, `result` holds the value produced by the .catch handler (the
await unwraps the promise), and the whole IIFE evaluates to a resolved
promise. It's not equivalent to try/catch (in that case the variable won't be
defined, for one thing), but it does allow for error handling without using
try/catch if one is inclined to do so.
In practice, I've been using async/await extensively in Node.js/Express, and
I'm yet to write a single try/catch. In Express, I simply use a wrapper
function that catches promise rejections and siphons them to error handling
middleware:
const asyncIt = fn => (req, res, next, ...args) => fn(req, res, next,
...args).catch(next);
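Hypothetical usage of that wrapper (the route and db call are made up):

    app.get('/users/:id', asyncIt(async (req, res) => {
      const user = await db.findUser(req.params.id); // a rejection here
      res.json(user);                                // flows to next(err)
    }));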
------
drinchev
These tips are all great. Thanks for the article.
My 2 cents from working over 5 years with NodeJS is to be careful when you
sacrifice code readability for performance.
`.forEach`, `.map`, `.filter` are way more readable / maintainable than a
20-line `for` loop with n+ var assignments.
As for `try..catch`, I prefer (readability, again) to follow a Promise chain.
return new Promise( resolve => resolve( JSON.parse( req.body ) ) )
.then( body => db.update( body ) )
.catch( err => /* will catch JSON.parse as well as db.update */ )
------
partycoder
If you want to squeeze more performance out of JavaScript, at some moment you
will have to deal with the gatekeepers of performance: inline caching, hidden
classes, deoptimizations, garbage collection.
------
maga
I feel like these "N performance tips" make it onto the main page mostly
because of the commenters coming up with counterpoints to each "tip".
~~~
iaml
Real performance tips are always in comments!
------
cmollis
Promise.all vs sequential awaits is a good tip, but only if the results of the
awaits are independent (obviously). I see that all the time. It's easier to
see how inefficient it is when you're chaining .then(); await hides that and
gives the impression that it's parallel.
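The difference in a sketch (fetchA/fetchB are hypothetical independent calls):

    // inside some async function:
    const a = await fetchA(); // fetchB doesn't even start until a resolves
    const b = await fetchB();

    const [a2, b2] = await Promise.all([fetchA(), fetchB()]); // concurrent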
------
mulrian
_Second point is, global variables are not cleared by the garbage collector.
So if you continuously add more and more global variables (which are not of
future use), it will cause a memory leak._
Errr...
~~~
Skinney
It's an easy thing to do in JS, it happens if you forget the `var`.
~~~
Klathmon
only when not in strict mode, which is becoming more and more rare, especially
in javascript codebases (as opposed to one-off scripts)
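Both behaviors in a tiny sketch:

    function sloppy() { leaked = 1; }  // no declaration: creates a global
    function strict() {
      'use strict';
      oops = 1;                        // throws ReferenceError instead
    }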
------
aforty
What if I specified the `global` scope, like `global.SOME_VAR`? Will that skip
the expensive search of the parent nodes?
~~~
Sawamara
It cannot. It still has to check whether there is a local variable in any
other scope that is above the one currently being executed, all the way to the
top where it finds global.
Same with window in a browser context. You could still have a "window"
variable placed between your execution context and the global context.
------
Lord_Zero
"Create class for similar kind of objects" would using ECMAScript 2015 classes
be acceptable?
------
brudgers
Profiling might be a useful addition to the list.
| {
"pile_set_name": "HackerNews"
} |
Trump adviser: $600 weekly boost in unemployment benefits to expire in late July - onetimemanytime
https://www.businessinsider.com/weekly-boost-unemployment-benefits-late-july-expire-kudlow-trump-2020-6
======
onetimemanytime
_" We're paying people not to work. It's better than their salaries would
get," Kudlow said._
| {
"pile_set_name": "HackerNews"
} |
Password Party: distributed password cracking in Javascript - goldmab
https://github.com/goldmab/password-party
======
goldmab
I wrote this thing and would like suggestions on how to improve it. Thanks.
| {
"pile_set_name": "HackerNews"
} |
Hackers in Space - c3o
http://events.ccc.de/camp/2011/Fahrplan/track/Hacker%20Space%20Program/4551.en.html
======
tonfa
FYI, it's a 3500 attendee camp. It is also almost sold out (presale
currently), so if you plan to attend and don't have a ticket, now is a good
time.
------
gimpf
Thanks to emerging low-cost satellite launches (like from
<http://interorbital.com/>), fun projects like <http://sat.mur.at/> are
already in the realm of being possible. I may be a hopeless optimist, but some
relatively low-bandwidth hacker-operated satellite network within 20 years is
not _totally_ impossible.
~~~
eru
In twenty years, as long as you got something up, it will be high bandwidth
compared with today.
------
stcredzero
_phase two: Put a hacker into orbit._
Phase 2.5: console widow, figure out how to get him back down. Alternate phase
2: send cat into space instead, start new meme and initiate hacker war with
PETA. (jk)
~~~
pdelgallego
Phase 2 is already in progress.
Check out the Copenhagen Suborbitals guys, they successfully launched their
first suborbital rocket a month ago.
<http://www.copenhagensuborbitals.com/>
~~~
gimpf
Nice video on their page. I especially loved their "caution fragile" marker --
on a rocket!
------
peterwwillis
Considering how most hackers build stuff at hackerspaces ("i don't know how to
build this, so let's go with trial and error and learn as we go!") this sounds
dangerous.
But cool.
------
binbasti
By the way, we'll set up a Ruby village at the Camp. Come and join us:
<http://railscamp.github.com/ruby-village>
You can also just support our cause to spread some Ruby love with a few bucks
in our Pledgie, which is only 50% complete at the moment:
<http://pledgie.com/campaigns/15397>
------
derrida
Other interesting projects include an attempt to have a transparent open
source leaking platform to carry on Wikileaks' legacy.
~~~
Joakal
OpenLeaks; "Instead of publishing the documents, OpenLeaks will send the
leaked documents to various news entities or publishers."
Wish I was kidding: <https://secure.wikimedia.org/wikipedia/en/wiki/Openleaks>
~~~
derrida
I agree this policy of OpenLeaks is a bit dubious. The project the Chaos
Computer Club are working on is called 'GlobalLeaks'. I cannot determine if it
is going to be respectable.
EDIT: Just saw that OpenLeaks are going to be at the conference. GlobalLeaks
is a group attempting to create open source leaking platform. It might be that
'OpenLeaks' = 'GlobalLeaks'
------
bazookaBen
Are there going to be presentation slides / videos / ustreams? Would like to
cross the Atlantic to come see but can't.
~~~
rb2k_
They usually are recorded.
The last camp apparently was in 2007: <http://dewy.fem.tu-
ilmenau.de/CCC/CCCamp07/video/m4v/>
(More events: <http://dewy.fem.tu-ilmenau.de/CCC/> )
------
pasbesoin
I can't be the only one with a sudden, fond memory of the Muppets.
Which I find, indeed, hackerish. Jim Henson and crew hacked an entire
presentation medium, mainstreaming an entire character type (the puppet,
formerly relegated to kids shows and ventriloquists).
I hope this combination of earnest endeavor and levity can escape our gravity
well.
------
agentultra
Sounds like it might be a little more expensive than an arduino kit and some
free time.
Yet I remain optimistic that it might eventually only cost the same as a
transatlantic flight does today to get a 180lb human into orbit.
PCs were started this way...
------
Joakal
The Hacker Space Program should tackle the next piracy: 3D printing (formerly
known as RepRap) [0].
[0] <http://reprap.org/wiki/EndOfIntellectualProperty>
------
cyrus_
This is absurd -- the hacker movement is too small and undisciplined, and it
lacks substantial knowledge about engineering and physics, to make this its
central project for the next 30+ years.
I'm all for big projects, but the hacker community should focus on what it
knows how to do -- write software.
------
ANH
Neat, but in order for me to take this seriously they have to at least spell
'satellite' correctly.
| {
"pile_set_name": "HackerNews"
} |
The Spread of Feynman Diagrams in the USA, Japan, and the USSR (2004) [pdf] - pja
http://web.mit.edu/dikaiser/www/Kaiser.SpreadingTools.pdf
======
pja
Posted this in the discussion on Tesla's shipping difficulties & dctoedt
suggested it be posted on its own.
Knoweldge transfer is _hard_. This paper is about tracking the spread of the
use of Feynman Diagrams (which helped enormously as an aid to the calculation
of QM results using perturbation methods) within the physics community, but I
think it contains lessons for all of us.
The work was expanded into a book by David Kaiser: “Drawing Theories Apart:
The Dispersion Of Feynman Diagrams In Postwar Physics”
[https://www.amazon.co.uk/dp/B002Y5W2X2](https://www.amazon.co.uk/dp/B002Y5W2X2)
~~~
tritium
I think knowledge transfer is hard, because teaching requires time for neurons
to reorganize or maybe fix themselves in place, in order to properly cement a
retainable fact. It's just a quirk of the human condition that memories form
and persist at limited speeds.
| {
"pile_set_name": "HackerNews"
} |
Ask HN: What are your tips to reduce drinking alcohol? - bentossell
I don't want to go teetotal, I enjoy a drink and drinking socially, at dinners, weekends etc. But there are times I drink too much and forget what happened the night before or get anxious that I did something or said something. I only ever drink on a weekend and not throughout the week.<p>Justin Kan just announced he's giving up drinking alcohol: https://twitter.com/justinkan/status/1120188473263050758 and I admire people who do this. But I always see info/tips on giving up completely, what about just keeping a lid on it?
======
gottebp
My grandfather was an alcoholic. It came about because of the great sorrow he
felt after his own father died in a car accident. A small drink each day
became a larger one. This iterates. Time passes. I remember when he was in
rehab once, as a child I was brought to visit him. His genuineness was so
magnified. He was such a good man sober; well known and loved by many in the
town through his small business. He never broke free for long though. Never
violent but never fully there because his faculties were always suppressed by
the effects.
After many years, when his health was failing, he begged us grandchildren not
to follow this path. The regret was so palpable. This was later in my college
years, and with all the parties it was hard to pull back. Augustine once
wrote, "To many, total abstinence is easier than perfect moderation". It is a
respectable path especially for anyone too far along. I am no teetotaler
either though. Moderation is also admirable.
I set two limits and they have worked fairly well in my own life:
1) No more than two reasonably sized drinks in a day.
2) Never drink alone.
I nurse those drinks along and savor each tiny sip. It has worked well for
over ten years and prevented ramping up into anything further. A suggested
third rule that has grown on me is:
3) Drink only to amplify joyful occasions never to drown sorrowful ones.
It may not be for everyone however perhaps this will help some. Godspeed.
------
La-ang
Change your social circle and engage in a radical (to you) sport that requires
lots of stamina and endurance (Muay Thai). In short, the lifestyle you have
goes hand in hand with alcohol. Sports are a major player. Not only the body
gets hammered but the mind as well. I used the same method before to quit
smoking. Even if you will show up at social events you won't have the same
longing for booze as you used to. I firmly believe this applies to the general
public. Athletes in general do not drink as much as the average Joe (if they
do at all). It's funny how people want an easy, not-so-harsh solution to their
problems, because there simply isn't one. Once the elephant in the room starts
ballooning you need to set your ego aside and admit you need change; major
change.
~~~
ativzzz
I second this one. It is difficult to just change a habit on a whim. A high
intensity sport (like a martial art) requires changing lifestyle habits if you
really want to see serious progress.
~~~
La-ang
You missed my last point. It's not meant to be easy.
------
ilaksh
I think I have a different view on this than most people here.
It seems that there is a tendency to simply suggest that anyone who does
drinks to excess sometimes is an alcoholic and therefore is a special case who
cannot drink.
What I observe is that most (not only a special few) people who drink have
times where they over-do it and it usually has significant negative
consequences in those cases.
In my opinion people want to blame the individual because they love alcohol
and don't want to admit that it could be a problem in itself.
I think there is a basic thing working against everyone who intends to
moderate their alcohol consumption which is that alcohol reduces your ability
to make good decisions. So on a bad day or circumstance with a reduced
cognitive capacity, anyone can make a wrong decision about whether to have
another drink.
So personally I think the answer is just to find other activities that are fun
that don't require alcohol. Also this idea that anyone who runs into problems
is an alcoholic is false and effectively stigmatizes people who decide to quit
because it has the suggestion that they are an alcoholic and something is
wrong with them.
If you sometimes run into problems with alcohol it's not you. It's the
chemical. Don't let people try to blame you for it.
To me the answer for social lubrication is just to have some kind of structure
for group activities, in other words a game. It works fine. You don't see kids
walking around depressed because they couldn't go drinking with their buddies.
Kids have activities and play games and have fun. There is nothing about being
an adult that makes it so you can no longer have fun without alcohol.
------
neekburm
You might consider the Sinclair Method. [https://cthreefoundation.org/the-
sinclair-method](https://cthreefoundation.org/the-sinclair-method)
You take an opioid antagonist, like naltrexone, 1 hour prior to drinking.
Since drinking produces endorphins, which are blocked by the antagonist, the
brain stops associating drinking with pleasure, which results in a lower
desire to drink.
The downside being that if you drink without the antagonist, your brain
returns to its old patterns.
Anecdotally, my personal experience was after trying the method was that I no
longer wanted to drink, and when I did, with or without the naltrexone, my
problematic drinking behaviors mostly went away. I mostly abstain now.
~~~
throwaway127831
Upvoted. TSM worked wonders for me. I’m now past 9 months of total abstinence
after using TSM for about 8 months leading up to quitting.
The idea of counter-acting the dopamine rush one gets from drinking makes a
world of sense. Alcohol is a fundamentally addictive substance and naltrexone
can help loosen its hold on your brain.
Abstaining now is fairly trivial. I pretty much never have the impulse to
drink.
I think a lot of folks see Sinclair method as somehow cheating or “having your
cake and eating it too”. While that may be an alluring idea, the reality is
you wind up just not wanting to drink. I now associate drinking with how
naltrexone makes me feel, which is not the most fun experience.
------
paddy_m
When I lived in NYC I started inviting friends for a "walk and talk" instead
of going to a bar. Going to a bar is such a generic crutch for "I want to
spend some time with you". Walking around the city and catching up while on a
nice amble was a refreshing change. Saved some money too.
~~~
peepanpeter
This is really good advice. I have started going on walks with my friends as
well instead of going to a pub. Inspires ideas and you get way clearer
thoughts.
------
hashberry
If you're not a "one or two drinks" type of person and alcohol is negatively
impacting you, then you really have to permanently stop drinking like Justin
Kan. The average drinker "keeps a lid" by moderating their intake because they
know how to stop. It's that simple. If you don't know how to stop, then you
have to stop completely.
~~~
abstractbarista
This is something I guess I cannot understand about people who consume alcohol
to the point of blacking out... For me the consequences are so harsh I just
never go near that point. I achieve a certain level of buzz, and simply
maintain it, sipping throughout the social event. It's quite pleasant, doesn't
take many drinks to uphold, and I don't feel hungover in the morning. Maybe
some of it is just "brain wiring"?
------
lisper
> there are times I drink too much and forget what happened the night before
That's bad.
You may not be able to reduce your alcohol consumption. Not everyone can.
But...
> I only ever drink on a weekend and not throughout the week.
That's a good sign. So what's happening here is that alcohol lowers your
inhibitions. When you're sober, you have control over yourself, which is why
you can get through the week. When you start drinking, you lose that control,
so you keep drinking, and you pass out. Whether you can drink without crossing
that threshold is an open question. Like I said, not everyone can do that. For
some people, one drink pushes them over the edge.
Do this experiment: next weekend, don't drink at all.
If you can do that, then the FOLLOWING weekend have ONE drink and STOP.
(That's one drink for the WHOLE WEEKEND. Not one drink on Friday and another
on Saturday.)
If that works out OK, then the NEXT weekend, have two and then stop. Not all
at once. Pace yourself. At least an hour between drinks. If that works out OK,
then the NEXT weekend go up to three.
If at any time you find yourself thinking, "This is all OK, I can have one
more" then STOP. That's the danger sign.
If at any stage in this process you find yourself making excuses for why it's
OK short-circuit any step in the process, then I have bad news for you: you're
an alcoholic, and if you don't want to keep blacking out you're going to have
to just stop.
UPDATE: raztogt21 also had some very good advice: never drink alone. It also
helps to let your drinking buddies know that you're on this program so that
they can help keep an eye on you. If your friends ever encourage you to drink
past your limit, get new friends.
~~~
brodouevencode
My fear is that OP is an alcoholic (yes, you can be an alcoholic without
drinking - it's that you're genetically predisposed to having a drinking
problem). In this case abstinence is the only answer. If you have problems to
which you cannot give up even a weekend then that's a red flag.
~~~
lisper
Yes, I share your concern. That's why I framed the answer the way I did. But
not everyone who occasionally drinks to excess is an alcoholic.
~~~
favorited
And, despite popular belief, there are other ways to manage alcoholism than
the 12 steps & total abstinence.
~~~
brodouevencode
Are the other methods better? What are those methods?
~~~
favorited
One medical option mentioned upthread is the Sinclair Method. It involves
taking medicine like naltrexone before drinking. The medicine blocks the
endorphins released by drinking, so you don't get the pleasurable "reward."
You basically retrain your brain, reducing your desire to drink.
When I was struggling with alcohol, I found that I needed a secondary reason
to manage my drinking. Once I decided that I wanted to start dating more, and
realized that my drinking and the related weight gain was an impediment, it
was an incentive to reduce my alcohol intake to a more responsible amount.
I didn't quit– I still love California wine and I'm building a nice
collection. And when seasonal beer that I like comes out, I definitely pick
some of it up. But it doesn't control me like it used to.
------
PaulHoule
My take is that you're better off finding a social life that is not alcohol
fueled.
I have never "blacked out" but I did get in trouble with my extended family
for going to parties with an open bar, drinking too much, and acting like an
ass.
That for me was a wake-up call.
Then there was the time that we made a huge amount of applejack and around the
holidays I would drink consistently quite a bit and find that if I didn't
drink my body felt warm and I was a little irritable and I figured that was a
sign of getting physically dependent -- that was a wake up call too.
More recently I've found that I usually wake up with a glucose reading around
95 (good but not great), but if I drink alcohol and have disrupted sleep I get
a glucose reading around 120 (flaming diabetic).
As a result of that it is really rare that I drink these days. Maybe I have a
beer or two now and then but not on a regular basis.
For me the consequences and fear of consequences has been sufficient
motivation.
------
ziddoap
YMMV, but after a substantially embarrassing night (that I found out about the
following day) I realized that the feeling of wondering if I did something
embarrassing (or, the horror of finding out that I did) was quite a bit worse than
the good feelings I got from drinking that much.
Now, whenever I drink I keep those thoughts in the forefront of my mind. I
know I will have _more_ fun if I don't go overboard - with a bonus that I
don't need to worry about anything the following day. Whenever I'm approaching
my limit, I weigh each drink with these thoughts in mind. What will be more
fun? An extra drink, potential embarrassment, and lots of anxiousness? Or a
glass of water, no embarrassment, and no anxiousness?
I imagine this requires friends who wont egg you on. I'm lucky to have a group
of friends that understand when I say "I'm done for tonight", they say "okay,
no problem! want a water? pop?" rather than encouraging me to get black-out
drunk.
~~~
cimmanom
This. Except for me it’s the hangover rather than the embarrassment.
It can help to find something that isn’t water to drink that will make your
companions feel less like they’re leaving you out - be that soda, juice, club
soda, etc.
------
jppope
1\. Build a budget for drinking. Nothing caps it like limiting the amount that
you can spend on it.
2\. Have your socializing focus on activities as opposed to drinking. E.g. if
you are bowling, you'll be more focused on the game than on the drinking.
3\. Drink a glass of water in between each drink. Double bonus for reducing
hangover effects.
4\. Buy a breathalyzer. Yep! A weird one, but you can keep your BAC below the
driving limit and you'll be pleasantly surprised.
5\. Change locations often, and walk in between locations. First, this is
great for getting quality time with your friends. Second, you can use this
trick to save money (happy hour in one place, dinner in another). Third, there
will be prolonged periods of time between each location that you will not be
drinking.
Sounds like you could also benefit from just imbibing drinks that are limited
in alcohol content. Session Beers, Campari spritz, some sakes etc.
~~~
chirau
What if your workplace is part facilitator? Many a startup's fridges are full
of beers and other alcohol in full view. Which, I think makes both 1 and 2 a
challenge.
For 3, I feel like drinking a glass of water in between drinks, though a good
thought, only eggs you on to drink more since you know you are countering it
at each turn.
~~~
jppope
The author of the original post said that he doesn't drink during the week. I
would agree that a fridge full of beer makes it harder.
------
peepanpeter
This is just a personal anecdote, but after taking Ecstasy/MDMA once around 9
months ago I realised how bad the alcoholic "high" is. Since then I have been
drinking way less. Often only one or two beers, then I'm done. I guess what
I'm trying to say is to reflect on who you become and how alcohol affects your
personality, temperament and motor skills when drinking excessively. Knowing I
will become bad at speech, get bad motor control, and start focusing more and
more on sex, and will start behaving like a cave-man towards women and men
alike makes me think of alcohol in a totally different light than before.
------
bluewater
I started to really question my drinking over the last year or so. During the
week I would drink a few beers or glasses of wine a night with a martini or
other cocktail sprinkled in. The weekends would often involve more with a
heavy hangover coming after neighborhood parties. I’ve gone the route of
moderation and counting drinks and it was difficult to do, depressing and left
me feeling deprived much of the time. One day I was in a forum reading a
question much like this and someone mentioned a book called This Naked Mind by
Annie Grace. It’s been a game changer for me and I’d highly recommend it. She
actually suggests continuing to drink normally while you read it. She dives
into a lot of the science and really opens your eyes to the world of alcohol
around you. I haven’t stopped completely and I’m not sure if I ever will but
the pull that booze had on me before is gone and I can confidently feel I’ll
have a drink whenever I want; the idea of allowing yourself that freedom is
empowering. Over the last 6 months I’m down maybe 75%. The social pressures
are really the hardest for me now. The line of questioning you get is intense
and something I was guilty of doing myself to others who weren’t throwing them
back with me at the time. Best of luck in your own journey wherever it takes
you.
------
DanBC
Set a limit before you go out. Alternate alcoholic drinks with other drinks.
Choose smaller, weaker drinks. Ask your friends and family for support with your
new lower limits.
[https://www.nhs.uk/live-well/alcohol-support/tips-on-
cutting...](https://www.nhs.uk/live-well/alcohol-support/tips-on-cutting-down-
alcohol/)
------
sjg007
Take it one day at a time. Tell yourself that today I will not drink. There
are other methods, such as being curious about your behavior: ask questions
of why you drink, or ask yourself why you are drinking too much. You can find a
therapist who would help with that. Be curious and explore the conditions in
which you find yourself wanting a drink and what reasons may account
for them. Can you replace alcohol with something else? Then there's naltrexone
and other medical interventions that work well. Essentially you redirect your
biology and associate alcohol with feeling sick so your body will develop an
aversion to it. Other methods include changing your group of friends if they
are a source of drinking as well unless are ok being sober in their company.
You can also join AA or find a sponsor. AA doesn't work for everyone but it
sure does show you a cross section of life and how alcohol impacts it.
------
cableshaft
For me, having hobbies that don't often include drinking seems to help. I play
board games, go to book discussions, writer's groups, attend hack nights, go
see a movie, go out hiking with people.
Depending on the group, there could be alcohol involved, but usually there
isn't, in my experience (or just one or two people will drink, and only a
glass or two).
For me it very much depends on who I'm with. I once dated a girl that loved to
get wasted on alcohol on dates, and I liked her so I drank a lot with her
myself. The current woman I'm dating gets sick if she drinks alcohol, so I
pretty much only drink a drink here or there when I meet up with her family or
I'm out with certain friends. I easily go six months without a drink now, and
usually not much more than a couple of drinks a month.
If I drink too much my next day is completely unproductive, and I hate that,
so that also helps to keep me from drinking too much also.
------
leesec
Hard limit at 3. Never more. That's it. You don't need more than 3 in a night.
Enjoy those 3. Never cross it.
------
thorin
I used to drink quite a bit as a student and for a few years after uni. It's
very common in the UK.
Now I have children I'm up in the night quite a lot, and if you were drunk
it would feel horrendous. Also there is the possibility that you might need to
be there for them if they were sick or need to go to hospital in the middle of
the night. Additionally I like to do a lot of sport in my free time and I
struggle to do anything even a gentle bike ride or walk if I've had a drink.
A couple of years ago I set myself a 2 drink limit. I've only broken it once
or twice and I don't see that as an issue. Most of the time it means I'm ok to
drive in an emergency, can be up early without a hangover and can stay active.
I have a beer at home 2 or 3 times a week and really enjoy it so I don't think
drinking alone is an issue for me.
Wish I'd done it years ago!
------
NicoJuicy
It's easy for social drinking, I did it too.
There are some great 0% beers out there in Belgium. I personally recommend
"Brugse Zot" ( with alcohol) and "Sport Zot" ( without alcohol), nobody will
see the difference and it will make it a lot easier. ( Both taste great)
If you go to someone at home, just bring 1 pack of each with you and drink the
0%. It's the best trick I found out to reduce social drinking.
Just drink 1 of 2 without and try it out. It's actually not much different,
the social "vibe" is the same.
The biggest difference seems to be when you drink water ( socially)
PS: I only drank one evening per weekend. But it was mostly when I had a lot of
stress from work that I drank too much.
------
craftinator
Don't buy alcohol. Seriously, when there's no booze around, I don't drink.
When there is booze around, I often end up drinking ALL of it. Don't buy it,
and you're good to go.
~~~
saddestcatever
True!
You can "out plan" temptation if you avoid making decisions in the moment.
Though - the same as a careful diet - social engagements throw a wrench in the
discipline machine. Restaurants, bars, parties, etc.
~~~
craftinator
Ah yeah, true true. I tend to deal with the social temptations by
"pauperizing", as my wife describes it. I was very, very poor in college,
would often eat before social engagements to avoid prepared meal prices etc...
So now when everyone is having drinks, I pretend I'm too poor to afford them,
always walk to the bar and grab a soda water on the rocks. Everyone assumes
I'm drinking a mixed, and I can always DD. It's interesting how being that
frugal can improve my discipline, rather than degrade it!
------
chirau
What is a 'normal amount'? Is there a way of calculating this?
I know people who drink three drinks and they are gone. I also know people who
drink 6 drinks a day, don't black out, get home safe and are fully (seemingly,
not sure) functional and productive the next day.
Reducing to 'normal' or some level has to be backed up by meaning, I think. And
that meaning of normal is what I am trying to discover.
------
jchallis
Precommit. When you go to drink tell everyone (including the barkeep) your
limit. When you have had a few and your inhibitions drop, come back to your
precommitment.
Do not keep alcohol in your house - inconvenience is a friend of sobriety.
Remember the old saw about alcoholism : first you take a drink, then the drink
takes a drink. A little bit of a runaway process may not be in your control.
------
sunstone
One technique that's worked for me is drinking a good non-alcoholic drink. In
my case Beck's non-alcoholic beer is the best I've found. It's not cheap and
costs more than the usual beer I drink.
It's good enough though that I hardly notice when I drink it rather than my
usual, traditional 5pm beer.
------
nf05papsjfVbc
Ensure you've eaten well.
Pick a drink which has a taste to your liking. Relish it while you drink it.
Drink a glass of water between drinks.
Stop after a couple of drinks. This is much easier if you've done the above.
(This may turn out to be rubbish advice but there may be a good idea or two in
there.)
------
ryanlol
I’m in the middle of a $5000 bet with a friend on who can stay completely
sober the longest.
This seems to be working great for both of us, it’s been almost a month with
no end in sight.
It’s crazy how satisfying non-alcoholic beers are, can’t say I really miss
“real” beer.
------
TheAlchemist
I would highly recommend the book "Alcohol Explained" by William Porter.
I found it very simple yet powerful - down to earth and practical, cold
explanation about every side of drinking alcohol.
------
raztogt21
1\. Never drink alone
2\. Socially, never have more than 2
Follow those two rules, and you should be fine.
------
bradstewart
Count your drinks. It's way too easy to just say "yea I'll have one more" when
you're not consciously thinking "well, I've had 6 already..."
------
vkaku
Looks like your system is already giving negative feedback and you are
reacting to it.
------
beavisthegenius
Remove the psychological pressure that's driving you to drink and you'll find
you don't want to drink. I changed my job and dumped all my drinking friends.
I'm also ok forging my own path without friends so that part was easy for me.
------
billybrown
[https://www.reddit.com/r/stopdrinking/](https://www.reddit.com/r/stopdrinking/)
------
npc_george123
I never keep alcohol in my house. If I am really craving a beer, I might buy a
six pack, drink one, then throw the other five in the trash.
~~~
chelmzy
Why not just purchase 1 beer without the six pack?
~~~
npc_george123
I buy the one beer unless it isn't sold individually. Only maybe once a year
would I buy a six pack and only drink one.
------
ko-ko-ko
Drink hard liquor.
It may depend on the culture of your country and social circle, but cutting
beer, wine and cocktails has allowed me to drastically cut my alcohol intake.
For example, people around me will often drink wine while eating lunch or
dinner. Or they might offer you a beer if you're coming in on a hot sunday
afternoon. Or if you're watching TV together. In all of these situations,
drinking hard liquor or asking for it would be strange socially speaking.
There are only a few occasions (before dinner with appetizers, after lunch as
digestive, while having "a" drink after work...) in which it's okay to drink
liquor and to be the only one doing so. And if you're invited, it would be
rude to ask for a lot of liquor from your hosts (because it's more expensive
than wine and they likely did not prepare for it anyways so the supply is
short).
IMO, it's also easier to feel you're getting drunk with liquor, because it's
not as gradual as beer: when you stand up after having 3 or 4 drinks, it hits
you right away that you've had too much. So you know it's time to switch to
water.
~~~
brodouevencode
This is terrible advice.
~~~
ko-ko-ko
It would be if OP were an alcoholic which he clearly isn't.
OP does not mention dependence issues, his drinking is not even habitual and
the effects on his social life or personal health are minor. All the advice
about 12 step programs, lifestyle changes with exercising and teetotaling is
going overboard b/c that is clearly not what OP asked about.
OP wanted tips on how to drink "sustainably". I find that drinking light
alcoholic beverages is harder to control than hard liquor both for physical
and social reasons (more social pressure to drink "light drinks", more stigma
on "liquor")
~~~
brodouevencode
> It would be if OP were an alcoholic which he clearly isn't.
You don't know that, and can't be determined by a very abbreviated post.
| {
"pile_set_name": "HackerNews"
} |
Ask HN: How is knowledge management done at your workplace? - wowsig
======
ScottWhigham
How about you start by seeding this with some useful info? I'm tired of seeing
one-line questions asking for "something that it would take me 10+ minutes to
answer well" here. It happens all too often:
\- Ask HN: A really broad question?
\- 5-10 really useful, helpful replies but OP never comes back to thread
| {
"pile_set_name": "HackerNews"
} |
Signs of Liquid Water Found on Surface of Mars, Study Says - uptown
http://www.nytimes.com/2015/09/29/science/space/mars-life-liquid-water.html
======
netcraft
The first I have heard of Don Juan Pond
([https://en.wikipedia.org/wiki/Don_Juan_Pond](https://en.wikipedia.org/wiki/Don_Juan_Pond))
although I wouldn't say it looks like a swimming pool
([http://www.lakescientist.com/wp-
content/uploads/2014/04/don-...](http://www.lakescientist.com/wp-
content/uploads/2014/04/don-juan-pond.jpg)).
Edit: Here is an article from brown that also draws parallels between Don Juan
and Mars:
[https://news.brown.edu/articles/2013/02/antarctica](https://news.brown.edu/articles/2013/02/antarctica)
~~~
dsfsdfd
Not so long ago it was stated frequently and with certainty that there was no
liquid water on Mars. That there could be no liquid water on Mars. Which
having read into the topic I thought was obviously nonsense. Now everyone is
like: there's water on Mars, we've known for a while - you can see the flows.
fact we deliberately don't go to places likely to be wet for fear of cross
contamination. I feel like I have been fed spin. Why don't other people see
this, and more importantly why aren't other people angry at being misled? I
feel like there is a consistent attempt to manipulate the truth before it is
presented for public consumption, in many spheres of life. Who are these
people that deem themselves entitled enough to decide what is known and what
is unknown?
~~~
zornthewise
I don't think most people intentionally do this. It just makes someone look
smart if they proclaim things confidently and so they do it. I try to be as
honest as possible when communicating and I have caught myself sometimes
stating things with more confidence than I had. I think it might just be part
of how humans communicate...
------
brianstorms
Dumb question probably, but here goes:
Given the latest NASA news about water on Mars, as in, “flowing water on
today’s Mars,” I am still trying to figure out how that would work.
If water boils at 79ºF when the atmospheric pressure is lowered to 0.5psi,
what would it boil at given Mars’ atmospheric pressure of 0.087psi?
How could Mars, which barely has an atmosphere, support liquid water? Why
would not the water basically evaporate quickly if not boil off almost
instantly?
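A back-of-the-envelope answer via the Antoine equation for water (coefficients
from standard tables, strictly valid for roughly 1-100ºC, so treat the result
as approximate):

    const psiToMmHg = psi => psi * 51.715;
    // Antoine: log10(P[mmHg]) = A - B / (C + T[degC]), solved for T
    const boilC = p => 1730.63 / (8.07131 - Math.log10(p)) - 233.426;
    boilC(psiToMmHg(0.087)); // ~0 degC, right around water's triple point

which is why pure liquid water is marginal on Mars at best, and dissolved
salts (which depress the freezing point and lower the vapor pressure) matter
so much.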
~~~
barkingcat
Those numbers are for pure water. If water is mixed with something, salts (not
just sodium salt, but salts of other elements as well), etc, that changes the
properties of the resulting mixture.
The water on Mars is probably a "salty" mixture, maybe slush, but we don't
really know what it consists of unless we have measurements / spectral
analysis of the water mixture.
~~~
m-app
Indeed. NASA showed the following slide, comparing the different temperatures:
[http://snpy.in/NdFEnS](http://snpy.in/NdFEnS)
------
nmc
Official NASA announcement: [https://www.nasa.gov/press-release/nasa-confirms-
evidence-th...](https://www.nasa.gov/press-release/nasa-confirms-evidence-
that-liquid-water-flows-on-today-s-mars)
------
crusso
_R.S.L.s are treated as special regions that NASA’s current robotic explorers
are barred from because the rovers were not thoroughly sterilized_
We're not going to sterilize our research equipment going to Mars and we're
not going to search likely sites where we could find life because we don't
want to contaminate them. And yet, looking for signs of life native to Mars
(or even water which might support life) is the top reason we're there,
judging by the headlines. Seems like someone in NASA needs to work out this
conflict.
~~~
gambiting
The article mentions how equipment that needs to be sterilized is baked to a
very high temperature - but I wonder, can it not be sterilized with radiation,
like we already do with food? All electronics on a rover are already radiation
hardened, after all Curiosity has to work while having a chunk of plutonium
strapped to its back. Is this not an option for some reason?
~~~
gee_totes
What happens when Curiosity contaminates the RSLs with radiation?
~~~
jerf
Humans did not invent radiation, you know. In fact we live in a remarkably
well-shielded area. Certainly better shielded than the surface of Mars.
Besides, we're on another planet. If our payload was nothing but the nastiest
chemical we could come up with, we still couldn't ship enough to Mars for it
to matter.
No, the only payload that could conceivably be harmful is one that can self-
replicate, which at this juncture means life. (Ask again in a hundred years.)
------
leonardzen
Wouldn't it be bad if life forms are found in Mars, according to this?
[http://waitbutwhy.com/2014/05/fermi-
paradox.html](http://waitbutwhy.com/2014/05/fermi-paradox.html)
~~~
NhanH
Speaking in Bayesian probability, it's bad if life forms found on Mars are
more advanced than human, and good if it's much less advanced, or even very
primitive (as it means the Great Filter might be at an earlier stage than us
right now).
And it won't be the former case, for obvious reason.
~~~
thaumaturgy
Finding life on Mars would still change one of the factors of the Drake
Equation, which would shift upward the overall expected probability for
intelligent life in the cosmos, which is probably what leonardzen means.
But "we don't have enough data yet" still seems like a good answer to the
Fermi Paradox anyway.
~~~
hanspeter
It would shift upward the overall expected probability for intelligent life at
_our level_. However, since we have no evidence of life beyond our level of
intelligence, it also increases the probability that civilizations will not
survive long enough to become much more advanced than our civilization.
------
coryfklein
> For the water to be liquid, it must be so salty that nothing could live
> there, he said. “The short answer for habitability is it means nothing,” he
> said.
This dose of realism dampens my enthusiasm _slightly_ , but if we can find
such clear signs of water via satellite image I think it is a great indication
that water is more present at the surface than previously thought.
~~~
smchang
Although, just a couple of paragraphs down
>“If it was too salty, they would be flowing year round,” Dr. Stillman said.
“We might be in that Goldilocks zone.”
------
peter303
Solar system life may turn out to be chemically similar. That is because it
arose on one body, then cross-infected other bodies by meteorite
transfer over many millions of years.
Mars became geologically stable before Earth, so it could have been the
earliest place for life. Then Martian meteors infected Earth.
~~~
chadzawistowski
How could a meteor have launched from Mars and reached Earth? It stretches my
mind to imagine a situation where a meteor drops down, picks up life, then
drops it off on the next planet. Are there records of meteoric impacts like
that?
The best situation I can think of is a catastrophic volcanic explosion which
launches a chunk of the planet into space, but that still sounds far-fetched
to me.
~~~
cLeEOGPw
Here's an example of how: an asteroid impact.
([https://en.wikipedia.org/wiki/Martian_meteorite](https://en.wikipedia.org/wiki/Martian_meteorite))
------
eddd
I know that without water life would never have begun, but I still don't fully
understand why it is necessary. It is still a big discovery, though. If life
is that common, I think we should be more careful when migrating there; a
"Martian" common cold could wipe out the entire human colony.
~~~
dangrossman
> I know that without water life would have never begun, but I still don't
> fully understand why it is necessary.
"Life" is a bunch of chemical processes. A necessary prerequisite is therefore
a solvent in which those chemical processes can occur, and which can move
substances around, whether it's within a cell, an organism, or an environment.
Of all the potential solvents, solids can't move stuff around, and in gases,
only volatile and highly concentrated substances would have a chance to react.
It's also _too_ easy to move around in a gas: potential gas-based "life" would
just...fall apart.
So the solvent and carrier being a liquid is also likely a prerequisite for
life. It also has to stay a liquid at a relatively wide range of temperatures,
and be abundant enough so that all the rare coincidences that might lead to
life have a chance of happening and perpetuating.
Water is the only molecule that fits the bill, and happens to be the second
most common molecule in the universe.
~~~
eddd
I am not good at chemistry, but wouldn't liquid methane–ethane fit the
profile?
~~~
eddd
Actually I found comprehensive answer here if anyone is interested:
[http://www.space.com/13639-alien-life-methane-habitable-
zone...](http://www.space.com/13639-alien-life-methane-habitable-zone.html)
~~~
VLM
It doesn't really say "why". Water ionizes really conveniently, so you can
make it an acid OR a base (which is sorta unusual, and means it can talk to
acidic or basic things); it's polar (it's got a + and a - electrical side,
more or less, so it can talk to + or - things); and it has hydrogen bonds (so
you get a liquid lifestyle at gas temps).
For one "why", look up amphoteric and zwitterion. Water has a super convenient
pH range where about one ten-millionth of pure water self-ionizes (ten to the
negative seventh, i.e. a pH of 7), and all kinds of super convenient reactions
occur above and below that pH, and it's really easy to manipulate ionization
rates around that level without using too much. You don't need just a liquid,
but one where you can really screw around with something like amino acids by
easily and cheaply changing the ionization rate of the liquid. So there are
super convenient chemical reactions that depend on the ionization level of the
liquid, and it's really easy to manipulate water's. Conveniently, water "just
works" without having to add tons of other stuff to it.
Liquid methane isn't polar enough to really be useful when messing with ionic
substances (table salt, etc.). Whatever you use for a liquid, it needs to be
polar so ionic stuff like salts can dissolve. It turns out that interesting
chemistry doesn't happen with non-polar substances at normal temps; that's why
when you bury stuff for a couple million years, the only thing left behind
undecayed (more or less) is non-polar hydrocarbons (crude oil). Ammonia is
polar but has other issues.
The hydrogen bonds are important. Water should be a gas at room temp. Really,
it should, looking at bonds and molecular weight and such. Yet the hydrogen
bonds that form keep it liquid at room temp. So you get "gas speed" chemical
reactions at "high" room temperature, yet it's liquid up to a ridiculously
high temp.
Maybe some custom liquid silicone with some bolted-on weirdness could make a
useful artificial blood plasma or "stuff" for life to live in. How that would
get made without the chemical plant existing first is mysterious.
~~~
cLeEOGPw
Sometimes I wonder whether it's not only impossible for other forms of life to
exist that don't use water, carbon and the rest, but also whether we have life
at all just because our universe happened, coincidentally, to be created with
the exact properties needed for the laws of physics to support life for a
short while in some places.
------
tomkwok
NASA's 'Mars Mystery Solved' Press Conference starts at 11:30 a.m. EDT.
Live TV:
[http://www.nasa.gov/multimedia/nasatv/](http://www.nasa.gov/multimedia/nasatv/)
~~~
sjg
Seems like the NASA TV site is not loading for me here.
[http://www.ustream.tv/NASAHDTV](http://www.ustream.tv/NASAHDTV) is working
however.
------
peter303
They've seen seep-like events in craters and canyons for over ten years. This
study determines it to be water and not something else.
------
beambot
They won't explore the water-flowing sites to: "minimize the chances of life
inadvertently crossing the solar system."
So let's just spread life intentionally. It seems like that is one of our
goals anyway -- not necessarily for governments (ie. NASA), but certainly for
private space flight.
~~~
rm445
The aim is to settle the question of whether life exists on Mars, or ever has,
before clouding the issue with modern contamination. I say modern just in case
of various panspermia-type possibilities.
That doesn't preclude eventually spreading life to Mars. Even if living
indigenous bacteria, or even plant life(!) were found - I think this is the
most ambitious scenario now conceivable, the 'Mars mat' of fiction surviving in
caves - it likely wouldn't be an argument to stop colonisation, though it
might have bearing on arguments about terraforming.
------
charleywolters
I mean that's great news but didn't they announce this like 10 times before?
Isn't there a meme about this, that NASA announced they found water on Mars
like once a year?
~~~
peter303
Some of those times were for ancient water. But they have shown evidence of
recent water before too.
------
Tinyyy
Youtube Live link, works reliably:
[https://www.youtube.com/watch?v=HDh4uK9PvJU](https://www.youtube.com/watch?v=HDh4uK9PvJU)
~~~
mkobit
Another link/stream/mirror (for those using Ctrl+F) -
[http://mars.nasa.gov/news/whatsnew/index.cfm?FuseAction=Show...](http://mars.nasa.gov/news/whatsnew/index.cfm?FuseAction=ShowNews&NewsID=1856)
------
foota
Does anyone else think that the form of life found on Mars would most likely
be completely distinct from what is found here, down to the molecular and cell
level?
~~~
mangeletti
Other than the possibility that the origins of life on Earth and of potential
life on Mars are the same, I think the same thing as you.
Imagine it is completely distinct, and doesn't have cells.
~~~
foota
Yeah, I missed that caveat. I'm really excited to see what it might be and how
it could potentially change our definition of what's living.
------
iridium127
Would that mean that this water has a very high salt content, since it is not
frozen?
~~~
exodust
They said "briny water" so yes.
------
madhurbehl
So where is the liquid water coming from? And if, as the article suggests, the
evidence indicates that water flowed just 'days' before, is it within the
realm of possibility to detect actual water from the MRO?
~~~
coryfklein
FTA
> “There are two basic origins for the water: from above or from below,” Dr.
> McEwen said. The perchlorates could be acting like a sponge, absorbing
> moisture out of the air... The other possibility is underground aquifers,
> frozen solid during winter, melting during summer and seeping to the
> surface.
Although "rain"/humidity is unlikely, the article also discusses why it can
still be considered a possibility since we don't have good humidity
measurements at the surface.
------
coldtea
Maybe arsenic-based life too? This special announcement for merely "signs of"
(instead of confirmation) speaks of PR and the need to secure next year's
budget...
~~~
sampo
Currently the mainstream opinion is that the bacteria didn't use arsenate, but
were very good at using the small amounts of phosphate that was still present
in the experiment. And that the experimenters were not very good at cleaning
all the phosphate out of the growth medium.
So the highly publicized 2010 study is now pretty much falsified.
[http://www.nature.com/news/arsenic-life-bacterium-prefers-
ph...](http://www.nature.com/news/arsenic-life-bacterium-prefers-phosphorus-
after-all-1.11520)
[https://en.wikipedia.org/wiki/GFAJ-1#Criticism](https://en.wikipedia.org/wiki/GFAJ-1#Criticism)
------
aidos
I'm so confused. "Definitive signs"? Is this the Nasa announcement or
something else? I thought it didn't start for another 15 minutes.
~~~
barkingcat
This is most likely the NASA announcement. Remember folks! Science doesn't
happen in a vacuum.
The scientific papers that outline this analysis were already available in
astronomy circles - the NASA event is the press conference letting the bigger
community know.
It's not like it's a secret! All the measurements were already made by the
Orbiter and were presumably analysed by researchers around the world prior to
this unveiling.
~~~
fudged71
"Science doesn't happen in a vacuum."
This is NASA we're talking about, right? ;)
------
codecamper
This is good news for Apple. Liquid Spill Indicators will generate revenue on
Mars too!
------
amai
If there were signs of oil, the US would invade Mars tomorrow...
------
betolink
This is just amazing!!
------
mspokoiny
This is amazing!!!
------
sidcool
Confirmed! Mars has liquid water.
------
a3n
If there is water of any sort on Mars, ice, liquid, salty, whatever, I think
finding life on Mars is inevitable. Life is very, very persistent, and niches
will be filled.
Unfortunately it's also all but inevitable that we'll bring some with us, if
we haven't already. Life is persistent, and ever-surprising.
~~~
tempVariable
Interesting. I read that in Jeff Goldblum's voice as well. Were there any
missions that lifted off from Mars and came back to Earth?
~~~
jkaunisv1
No, we don't have the capability to land something on Mars with enough fuel
for it to take off again.
~~~
wtracy
We don't need to:
[https://en.wikipedia.org/wiki/Sabatier_reaction#Manufacturin...](https://en.wikipedia.org/wiki/Sabatier_reaction#Manufacturing_propellant_on_Mars)
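(A rough stoichiometry sketch of why this helps; the molar masses are
approximate and the whole calculation is illustrative only:)

    # Sabatier reaction: CO2 + 4 H2 -> CH4 + 2 H2O
    # Approximate molar masses in g/mol.
    M_H2, M_CH4, M_CO2 = 2.016, 16.04, 44.01

    kg_ch4 = 1.0                        # target: 1 kg of methane propellant
    mol_ch4 = kg_ch4 * 1000 / M_CH4     # moles of CH4 needed
    kg_h2 = 4 * mol_ch4 * M_H2 / 1000   # 4 mol H2 per mol CH4, shipped from Earth
    kg_co2 = mol_ch4 * M_CO2 / 1000     # CO2 harvested from the Martian atmosphere

    print(f"~{kg_h2:.2f} kg H2 + ~{kg_co2:.2f} kg CO2 -> 1 kg CH4 (+ water)")

So you'd only need to ship roughly half a kilogram of hydrogen (or extract it
from Martian water) for every kilogram of methane made on site.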
~~~
jkaunisv1
You're right! Even though I cited the Sabatier reaction to a coworker today
when discussing the news, I didn't really think about it when answering. Also,
for some reason I thought it wasn't yet a technical reality.
I guess also wrapped into my statement was the thought that to be able to
launch something back it would have to be landed very well and require some
very complex machinery to prepare for launch, given the success rates of just
being able to land something on Mars without it breaking. I went with the
first reason off the top of my head :)
------
nsxwolf
Latest incarnation of the perennial headline... Best evidence yet of water on
Mars! I've been enjoying these all my life. This and "Nuclear fusion created
in a laboratory for the first time" headlines.
My favorite best-evidence-yet-of-water-on-Mars was that time the Phoenix robot
scooped up an ice cube and took a picture of it.
~~~
ForHackernews
This finding is about contemporary, liquid water.
~~~
nsxwolf
Not a new discovery in itself, either.
~~~
kenbellows
yes it is...?
~~~
coldtea
You haven't been following this news very closely, have you?
2008: NASA Spacecraft Confirms Martian Water, Mission Extended
[http://www.nasa.gov/home/hqnews/2008/jul/HQ_08_195_Phoenix_w...](http://www.nasa.gov/home/hqnews/2008/jul/HQ_08_195_Phoenix_water.html)
2009: Meteorite Impacts Expose Ice on Mars [http://science.nasa.gov/science-
news/science-at-nasa/2009/24...](http://science.nasa.gov/science-news/science-
at-nasa/2009/24sep_martianice/)
2010: NASA Trapped Mars Rover Finds Evidence of Subsurface Water
[http://www.jpl.nasa.gov/news/news.php?release=2010-355](http://www.jpl.nasa.gov/news/news.php?release=2010-355)
2011: Observations from NASA's Mars Reconnaissance Orbiter have revealed
possible flowing water during the warmest months on Mars.
[http://science.nasa.gov/science-news/science-at-
nasa/2011/04...](http://science.nasa.gov/science-news/science-at-
nasa/2011/04aug_marsflows/)
2013: Curiosity's SAM Instrument Finds Water and More in Surface Sample
[http://www.nasa.gov/content/goddard/curiositys-sam-
instrumen...](http://www.nasa.gov/content/goddard/curiositys-sam-instrument-
finds-water-and-more-in-surface-sample/#.Vgn9dKZrjdQ)
2014: New Evidence for a Mars Water Reservoir
[http://science.nasa.gov/science-news/science-at-
nasa/2014/19...](http://science.nasa.gov/science-news/science-at-
nasa/2014/19dec_marswater/)
~~~
mehwoot
The first three links and the fifth are all talking about ice, not liquid
flowing water.
The fourth link isn't conclusive:
_These results are the closest scientists have come to finding evidence of
liquid water on the planet's surface today._
The last link summarises that liquid water has not been conclusively found
yet, and it is the most recent:
_While recent orbiter missions have confirmed the presence of subsurface ice,
and melting ground-ice is believed to have formed some geomorphologic features
on Mars, this study used meteorites of different ages to show that significant
ground water-ice may have existed relatively intact over time.
Curiosity’s observations in a lakebed, in an area called Mount Sharp, indicate
Mars lost its water in a gradual process over a significant period of time._
So none of your links contradict what the person you are replying to said.
For Taylor Swift, the Future of Music Is a Love Story - prostoalex
http://online.wsj.com/articles/for-taylor-swift-the-future-of-music-is-a-love-story-1404763219
======
fpgeek
Interesting counterpoint: [http://www.vox.com/2014/7/7/5878603/taylor-swift-
doesnt-unde...](http://www.vox.com/2014/7/7/5878603/taylor-swift-doesnt-
understand-supply-and-demand)
~~~
memonkey
I'm curious what kinds of solutions are in the works for a post-scarcity music
industry? I'm skeptical of the author's solutions. I'm also ignorant of the
music industry, but if I understand correctly, things like Spotify and YouTube
are a broken model, usually paying out only the label and disregarding the
artist, or paying them in change. What kind of business model can/will focus on
the artist, or are artists and their fans on their own in the future?
~~~
btown
Ignoring blips like Google's current bender of destruction of advertising
revenue streams for small labels, I can see larger labels becoming a thing of
the past. What is the purpose of a label in the digital age, now that it can
no longer include physical manufacturing and physical distribution in its
value proposition? To consumers, it's quickly becoming no more than a
guarantee that associated artists are curated and are of "high enough" quality
to the ears and eyes of some hidden tastemaker. And to producers, they're
increasingly a glorified PR firm. The largest labels' business models don't
match up with these value propositions; they function as gatekeepers in an age
where unlocked gates are plentiful. This is not to say that all labels will
disappear; smaller, niche labels such as Monstercat, Neon Gold, and even
Roadrunner Records (in the 90s) function[ed] like well-branded aggregators,
exactly meeting those needs and having a well-defined target audience. Perhaps
we'll see small labels turn to subscription models, allowing access to
exclusive content from their members? Imagine iTunes with tons of small label-
run channels one could subscribe to for a monthly fee, and get access to all
the music within? Lots of chicken-egg problems here, but it's interesting to
think about.
~~~
louhike
Labels are useful for recording, mixing and marketing. They are not just CD
factories. Artists may manage without them, but it is quite complicated, as
those services cost a lot.
~~~
Exenith
When it comes to a lot of electronic music, there is no recording necessary,
and the artist is the person who mixes/masters. No need for money there.
Side ramble: I've never understood why people paint "real" bands as being lo-
fi, guttural, punk and romantic. It's a goddamn privilege to afford all that
equipment. Here's your true punk: pirate Ableton and upload a tune to YouTube
for free.
Back to the point, for a lot of styles, the _only_ point of a label is
marketing. But this can be a very useful tool -- even just acting as a quality
filter is useful.
~~~
louhike
You are wrong on the fact that musicians doing electro do the mix/master.
I know people doing electro, and they pay people to do this job. It is really
hard to do it correctly.
Even people like DeadMau5 and Daft Punk do not do the mix/master by
themselves. They just monitor the people doing it (look at some of their
interviews).
------
bobbles
It's interesting how she mentions the point of taking a recorded performance
and keeping it fresh by introducing guests on stage with each set.
Reminds me of how Louis CK, on advice from George Carlin (I believe), throws
out his material and starts fresh every year.
For creative people now, the key will be those who continue to be creative,
not those attempting to ride on one creation to the end.
The same is happening in the mobile world in app stores now. People expect to
get the initial offering, and then see continuous improvement through their
free updates for life.
How will people manage to get their continuing creativity and effort supported
in this market?
~~~
aikah
> The same is happening in the mobile world in app stores now. People expect
> to get the initial offering, and then see continuous improvement through
> their free updates for life.
What's happening in the mobile world is in-app purchases, where you are
coerced into buying digital goods one way or another. And it's coming to
non-gaming apps, trust me. That's the future.
As for Taylor Swift, she represents everything that is wrong with the music
industry. She is a mass-market product, not an artist.
~~~
gr2020
> As for Taylor Swift,she represents everything that is wrong with the music
> industry.She is a mass market product,not an artist.
Not a fair characterization, IMHO. She writes her own songs, and manages her
own career. She might have mass market appeal, but I don't see how you can
argue she's not a legitimate artist.
~~~
soganess
Because he believes what she is producing does not meet the threshold
requirement for being deemed art?
Just because she is referred to as an artist by an industry attempting to
profit from her production does not de facto make her one. Perhaps Taylor is
just a shrewd businessperson with a natural understanding of what has strong
social appeal. What if she openly admitted that was her intent? Would you
still call her an artist? I would not, and I imagine a great many others
wouldn't either. There is a lot to being an artist, and this automatic
labeling is misleading and perhaps, if you are extra paranoid, nefarious in
intent.
------
7Figures2Commas
> Music is art, and art is important and rare. Important, rare things are
> valuable. Valuable things should be paid for. It's my opinion that music
> should not be free, and my prediction is that individual artists and their
> labels will someday decide what an album's price point is. I hope they don't
> underestimate themselves or undervalue their art.
Music in the form of a physical or digital copy of a recorded track is not
rare. Album pricing is based on supply and demand. The latter has decreased
significantly in the past decade. Short of pulling a Wu-Tang[1], it would be
futile for artists to try to fight market forces.
If Swift really wants to discuss the value of music in the context of music as
an important, rare art form, focusing on what consumers pay for physical and
digital copies of recorded music makes about as much sense as valuing Monet's
Water Lilies series based on how much Water Lilies posters sell for.
There are several rights associated with music and people have been buying and
selling these rights for decades. Royalty Exchange[2] is an online marketplace
for these transactions, and there are even focused funds[3] that essentially
give investors the ability to treat these rights as an asset class.
Savvy artists focus first and foremost on ownership, not what consumers pay
for a song or album.
[1]
[http://www.theatlantic.com/business/archive/2014/05/for-5-mi...](http://www.theatlantic.com/business/archive/2014/05/for-5-million-
one-person-will-get-to-own-the-only-copy-of-the-new-wu-tang-album/371020/)
[2] [http://www.royaltyexchange.com/](http://www.royaltyexchange.com/)
[3] [http://rhmusicroyaltypartners.com/](http://rhmusicroyaltypartners.com/)
~~~
ch
I'm confused by your post. What makes these royalties valuable is that they
are tied to the revenues generated by the sales of the same physical and
digital copies of the music that you point out are technically not rare. But
without the false scarcity created by copyright and the associated
constellation of laws which surround it, these artifacts would not create
revenue and then wouldn't the royalties also be worthless?
~~~
7Figures2Commas
First, while it's true that the rights I refer to are generally valued based
on the strength of the royalty streams, this does not mean that buyers and
sellers value these rights in a strictly formulaic manner. As with any asset,
there are a variety of factors that might result in buyers paying a premium.
An investor with the ability to purchase rights associated with an Elvis
Presley song, for instance, would probably pay substantially more for each
dollar in royalties than they would for rights associated with a song by a
less famous artist.
Second, and most importantly, not all royalties are tied to CD and digital
music sales. These are mechanical royalties. There are also performance and
synchronization royalties, which can be significant. It is entirely
possible, for instance, for a song that generates little in the way of
mechanical royalties to generate eye-popping performance or synchronization
royalties.
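(To make the premium idea concrete, here's a toy discounted-cash-flow sketch;
the function name and every number are hypothetical, purely for illustration:)

    # Toy valuation of a royalty stream as a discounted cash flow.
    # All inputs are made-up illustrative numbers.
    def royalty_value(annual_royalty, years, discount_rate, premium=1.0):
        pv = sum(annual_royalty / (1 + discount_rate) ** t
                 for t in range(1, years + 1))
        return pv * premium  # e.g. a famous-catalog premium

    print(royalty_value(10_000, 10, 0.08))               # baseline catalog
    print(royalty_value(10_000, 10, 0.08, premium=1.5))  # iconic catalog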
~~~
dobbsbob
The guy who wrote the Cheers sitcom TV theme song never had to work again and
lives off royalties.
------
hitchhiker999
She seems nice. The tone was more remarkable than the content (the content was
also interesting). It seems we've been inundated with this grotesque idea that
every single young artist is a boring dark repetition of the same old paradigm
('wilder' than the last).
Her point re: 'people bonding with an artist over a lifetime' may well be
salient. I must admit a touch of that with zeppelin / prodigy etc.