Surur t1_je6434v wrote

> They should get wages and shares

As you note, that is not a new thing; in fact it's a normal thing. So the reason Bezos is so rich is that he works for Amazon and was given the biggest allocation of shares.

Do you have a problem with that?

What if early Apple workers had become billionaires (like early Microsoft employees did)?

Do you have a problem with that?

Or early Tesla factory workers?

Do you have a problem with people getting rich from their shares?


Surur t1_je2ldsp wrote

GPT4 raises the following issues:

  1. Infrastructure and land requirements: Constructing large man-made lakes and the necessary infrastructure to reroute wastewater, filter out the algae, and perform pyrolysis is a complex and costly undertaking. Additionally, acquiring the land to build these lakes can be challenging, especially in densely populated areas.
  2. Water treatment efficacy: While algae can help remove some nutrients from wastewater, they may not be effective in treating all types of contaminants, such as heavy metals, pathogens, or pharmaceuticals. Depending on the composition of the wastewater, additional treatment processes may still be needed to meet water quality standards.
  3. Algae bloom control: Providing optimal conditions for algae growth can be challenging, and if not managed properly, can lead to harmful algal blooms (HABs). HABs can produce toxins and create hypoxic or anoxic conditions that harm aquatic life and negatively impact water quality.
  4. Greenhouse gas emissions: The process of pyrolyzing algae into charcoal requires energy, which may contribute to greenhouse gas emissions depending on the source of energy used. Additionally, there is the risk of methane and nitrous oxide emissions during the algae growth and decomposition process, which are potent greenhouse gases themselves.
  5. Climate conditions: The efficiency of algae growth for carbon capture and wastewater treatment is dependent on local climate conditions, such as sunlight, temperature, and precipitation. The performance of this approach may vary significantly across different locations, limiting its global applicability.
  6. Charcoal disposal and utilization: Once the algae is converted to charcoal, it needs to be disposed of or utilized in a way that prevents the re-release of captured carbon. This could include using it as a soil amendment, for carbon sequestration, or as a fuel source. However, each of these applications has its own set of challenges and limitations.
  7. Economic viability: The cost-effectiveness of this approach compared to traditional CCUS technologies or other carbon capture and wastewater treatment methods remains uncertain. A thorough assessment of the costs and benefits, as well as comparisons to alternative solutions, would be needed to determine its economic viability.

Surur t1_je074un wrote

> everything is a command to AI, it has no initiative. it drives to the field and stops, because to it, the task is complete.

Sure, but a fully conscious and intelligent human taxi driver would do the same.

AIs are perfectly capable of making multi-step plans, and of course when they come to the end of the plan they should go dormant. We don't want AIs driving around with no one in command.


Surur t1_jdzodsu wrote

If something is impossible, it may not be worth doing badly.

Maybe instead of testing a student's ability to write essays, we should be testing their ability to have fun and maintain stable mental health.

I mean, we no longer teach kids how to shoe horses or whatever other skill has become redundant with time.


Surur t1_jdxugcy wrote

Informed opinions are always more valuable, especially when she makes technical claims like:

> But GPT-4 and other large language models like it are simply mirroring databases of text — close to a trillion words for the previous model — whose scale is difficult to contemplate. Helped along by an army of humans reprograming it with corrections, the models glom words together based on probability. That is not intelligence.


Surur t1_jdxeibf wrote

> anything can be a drive through

Then that is a somewhat meaningless question you are asking, right?

Anything that will clue you in can also clue an AI in.

For example the sign that says Drive-Thru.

Which is needed because humans are not psychic and anything can be a drive-through.

> AI requires specifics.

No, neural networks are actually pretty good at vagueness.

> I mean seriously, i can disable an autonomous car with a salt circle.

That is a 2017 story, now five years old.