The onion is back, baby!
You can’t choose where you grow up. :(
I don’t think the Chinese room is a good analogy for this. The Chinese room has a conscious person at the center. A better analogy might be a book with a phrase-to-number conversion table, a couple number-to-number conversion tables, and finally a number-to-word conversion table. That would probably capture a transformer’s rigid, unthinking associations better.
No, you’re thinking of the first scene of the movie, where a fly falls into the teletype machine and causes it to type ‘buttle’ instead of ‘tuttle’.
Ok. I’m getting tired. You bested me this round. Have a nice day.
You say it’s the goal of the proletariat to protect the revolution, but why would they? Each proletarian would benefit from the revolution’s failure: they could live better lives among the bourgeoisie. You talk about the proletariat like they are some monolithic entity, with a single mind and goal. You talk big about helping the individual, but cannot see beyond their class. A proletarian is a person, with needs, desires, and opinions. What father would hold the abstract ideals of the “revolution” over the life of his sick daughter? Any father I know would do anything for the safety of his children, even hoard life-saving medicine from others.
Communist logix
we need to abolish private property so everybody has equal power.
we need a more powerful class of people to maintain public ownership.
After all, how can we enforce public ownership without a more powerful class of enforcers?
I’m pretty sure fused multiply-add is part of the FMA extension that ships alongside AVX2.
Fahrenhaters are always like, “nooo!! 40 degrees is so hot!!” Meanwhile, the fahrenchad’s resting body temperature is nearly 2.5 times hotter. All fahrenhaters would die at that temperature.
I always find it very funny when someone suggests anarcho-something as a solution to all of capitalism’s problems. How exactly do you plan to enforce that? Do you think social pressure & shunning will do anything more than create a class of extremists with an oppositional philosophy?
Recursion makes it cheaper to run in the dev’s mind, but more expensive to run on the computer. Subroutines are always slower than a simple jump.
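You can see the per-call overhead for yourself. This is a hypothetical sketch in Python (where call overhead is especially visible, since nothing gets inlined); the function names are made up for illustration:

```python
import timeit

def sum_rec(n):
    # every step pushes a whole new stack frame
    return 0 if n == 0 else n + sum_rec(n - 1)

def sum_iter(n):
    # a plain loop: the "jump" version, no call overhead per step
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

assert sum_rec(500) == sum_iter(500) == 500 * 501 // 2

t_rec = timeit.timeit(lambda: sum_rec(500), number=2000)
t_iter = timeit.timeit(lambda: sum_iter(500), number=2000)
print(f"recursive: {t_rec:.3f}s, iterative: {t_iter:.3f}s")
```

(To be fair, optimizing compilers will often inline small subroutines or eliminate tail calls, so the gap can shrink to zero in compiled languages.)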
Hand-written assembly is much more powerful than a Turing-complete high-level language because it lets you fuck up everything. Rust and Python are way too wimpy to allow a user to destroy their computer.
So you made a meme about how your opponent is completely irrational and you are a paragon of logic and reason, and then proceeded to declare yourself the winner?
This happens to everyone. It happens because your brain registers the other person saying something before it actually understands what is being said. And when most people don’t know what someone said, they ask, “what?” without even thinking. Source: my intro to psych textbook.
Everything can be done in constant time, at least during runtime, with a sufficiently large look-up table. It’s easy! If you want to simulate the universe exactly, you just need a table with n×m entries, where n is the number of Planck volumes in the universe and m is the number of quantum fields. Then, you just need to compute all of them at compile time, and you have O(1) time complexity during runtime.
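Here’s the same trick in miniature (a toy example I made up, with import time standing in for “compile time”):

```python
# "Compile time": precompute every answer we will ever need.
LIMIT = 1000
SQUARES = [n * n for n in range(LIMIT)]  # our entire "universe" of answers

def square(n):
    # "Runtime": O(1), just one table lookup, no arithmetic at all
    return SQUARES[n]

print(square(37))  # 1369
```

Now just scale LIMIT up to the number of Planck volumes in the universe and you’re done.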
There are bindings in Java and C++, but Python is the industry standard for AI. The libraries for machine learning are actually written in C++, but use Python language bindings. Python doesn’t tend to slow things down, since machine learning is GPU-bound anyway. There are also library-specific languages that urge the user to write pythonic code that can be compiled into C++.
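NumPy is the simplest example of this split (assuming NumPy is installed; the numbers here are just for illustration):

```python
import numpy as np  # the array machinery itself is compiled C

a = np.arange(1000, dtype=np.float64)
b = np.arange(1000, dtype=np.float64)

# One call crosses the Python/C boundary once; the actual loop
# over a million-ish operations runs entirely in native code.
total = np.dot(a, b)
print(total)
```

The Python-side cost is basically constant per call, so as long as each call does a big chunk of work, the interpreter overhead disappears into the noise.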
I completely agree that it’s a stupid way of doing things, but it is how OpenAI reduced the vocab size of GPT-2 & GPT-3. As far as I know–I have only read the comments in the source code–the conversion is done as a preprocessing step. Here’s the code to GPT-2: https://github.com/openai/gpt-2/blob/master/src/encoder.py I did apparently make a mistake, as the vocab reduction is done through a LUT instead of a simple mod.
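From memory, the LUT in that encoder.py looks roughly like this (paraphrased, not a verbatim copy, so check the linked file):

```python
def bytes_to_unicode():
    """Map every byte 0-255 to a single printable unicode character.

    Bytes that are already printable keep their own codepoint; the rest
    (control chars, whitespace, etc.) get shifted up past 255, so the
    BPE vocab never has to contain unprintable entries.
    """
    bs = list(range(ord("!"), ord("~") + 1)) \
       + list(range(ord("¡"), ord("¬") + 1)) \
       + list(range(ord("®"), ord("ÿ") + 1))
    cs = bs[:]
    n = 0
    for b in range(256):
        if b not in bs:
            bs.append(b)
            cs.append(256 + n)  # remap unprintable bytes above the byte range
            n += 1
    return dict(zip(bs, [chr(c) for c in cs]))

table = bytes_to_unicode()
print(len(table))       # 256: one entry per byte
print(table[ord("A")])  # 'A': printable bytes map to themselves
```

So it’s a pure one-to-one table, not anything arithmetic like a mod.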
Can’t find the exact source–I’m on mobile right now–but the code for the GPT-2 encoder uses a byte-to-unicode look-up table to shrink the vocab size. https://github.com/openai/gpt-2/blob/master/src/encoder.py
Don’t know why people are downvoting this. That’s canonically correct in the Jewish and Muslim traditions.