Humiliation turn-on

Kx500:
Does anyone else get turned on by others making cheeky, humiliating, or degrading remarks about their weight or weight gain?

It's happened to me in the last week and I found myself beyond aroused.

I have gone overseas for work and am staying with the mother of a friend of mine.

Her mother is and always has been a bbw who loves to cook and eat. I estimate her to be about 380 pounds.

I had not seen her in 18 months, during which I gained nearly 70 pounds, nor had I told her about it.

When I walked in the door, the mother's eyes nearly popped out of her head. She couldn't stop hugging me and rubbing her hands all over my thickened frame.

She made remarks like:

"Welcome to being a fatty"

"I cant wait to ruin your body more this week with my cooking"

"Your a fatty now and you will eat"

"Its good to see you with some meat on you im so glad all those muscles are buried"

"I never thought I'd see the day you'd trade in your athletic body for a fat one"

"There may not be room on the big sofa for both of us now that your not a skinny minny"

When she served dinner, she kept filling my plate and said:

"Keep eating more, remember when you use to do a bit more exercise this is like that with food"

"I cant wait to make sure you never go back to being a beanpole ever again but then again I dont think your going to let that happen either little fatty"

A friend of hers from across the street, a thick powerlifter type of woman, was also amazed at my transformation. She lifted me up and told me, "Keep eating, big boy, so I can lift you and use you to train."

On all occasions I was remarkably turned on.

Does anyone else suffer this when it happens to them?


Is she Greek?
3 days

My girlfriend keeps getting fatter

Zampano777:
Sorry, but you can't judge that.

Yes, she can, and in fact she did! (If she couldn't, she wouldn't have done it.)
You confuse your interpretations with hard facts. Sorry, but you urgently need to re-examine your own assessments.


You are also throwing out a vague accusation. When you accuse someone, you've got to be specific.

What is an instance where Munchies does what you accuse her of? And in that instance, what is the "hard fact", what is the "interpretation" and what's the difference between them?
5 days

A BMI of 50 or more

Munchies:
In other words, ChatGPT hallucinated this information.

+1
6 days

A BMI of 50 or more

Jakeescape99:
According to ChatGPT, only 0.25% of the US population has a BMI of 50 or more. I seriously thought that it was way more than that lol. I'd have to be over 400 lbs to hit that BMI, but I think that's going to have to be my goal lol.
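For anyone who wants to check that 400 lb figure, here is a quick sketch of the arithmetic, using the standard imperial BMI formula (BMI = 703 × weight in pounds / height in inches squared); the heights below are just example values, not anything stated in the thread:

```python
# Weight needed to hit a given BMI, using the standard imperial formula:
# BMI = 703 * weight_lb / height_in**2, so weight_lb = BMI * height_in**2 / 703.

def weight_for_bmi(target_bmi: float, height_in: float) -> float:
    """Weight in pounds that produces `target_bmi` at a height of `height_in` inches."""
    return target_bmi * height_in ** 2 / 703

# Example heights only; the "over 400 lbs" figure works out for someone around 6'3".
for feet, inches in [(5, 8), (5, 11), (6, 3)]:
    height = feet * 12 + inches
    print(f"{feet}ft {inches}in: BMI 50 at about {weight_for_bmi(50, height):.0f} lb")
# 5ft 8in  -> ~329 lb
# 5ft 11in -> ~359 lb
# 6ft 3in  -> ~400 lb
```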

Munchies:
This is a beautiful example of why you should not use ChatGPT for research.

The highest BMI category the CDC and the NIH track is 40 or more, which is severe obesity. This is a little over 9% of the US population. There's no tracking for Americans with a BMI of 50 or more.

In other words, ChatGPT hallucinated this information.

Letters And Numbers:
ChatGPT will sometimes give its sources if you ask. It might not be hallucinated, but it also might not be accurate. Although I don't know how reliable CDC/NIH data is going forward, either. Might be an issue-by-issue thing.


LLMs always hallucinate, even when the responses they give are accurate.

It's not about the results, though those provide hints; it's about the way they function, that is, LLMs work purely on statistics and not on some kind of rule-based system.

They sometimes deliver correct responses, yes, but even a broken clock is right twice a day!

In other words, you can arrive at a correct conclusion through hallucination; there is no reason why you can't. It's just that there are far better ways to reach conclusions.
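To make the "purely statistics" point concrete, here is a toy sketch (the probability table below is made up for illustration, not any real model's code): the next word is picked by sampling from learned frequencies, and nothing in that step checks whether the continuation is true.

```python
import random

# Toy stand-in for an LLM's next-token step: a table of probabilities estimated from
# text plays the role of the model's learned weights. Sampling from it can produce
# fluent, correct-sounding output, but no rule ever checks whether the output is true.
next_word_probs = {
    ("the", "clock"): {"is": 0.6, "struck": 0.3, "melted": 0.1},
    ("clock", "is"): {"right": 0.5, "broken": 0.4, "purple": 0.1},
}

def sample_next(context):
    """Sample the next word for a two-word context, purely from the probability table."""
    words = list(next_word_probs[context].keys())
    weights = list(next_word_probs[context].values())
    return random.choices(words, weights=weights, k=1)[0]

print(sample_next(("the", "clock")))  # usually "is", occasionally "melted"
```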
6 days

AI generated content, yay or nay?

Munchies:
Actually, historically, Enas and I don't get along. I'm simply being entertained.


I mean, I hope you get some knowledge out of this...
1 week

AI generated content, yay or nay?

FatGTP:
You are clearly overlooking the implicit assumptions you rely on. That's why the circularity is invisible to you. It will be difficult to help you from the outside as long as you keep confirming those assumptions to yourself.


By the way, I know how to extract implicit assumptions.

You basically have to take the stated premise and conclusion of the argument made by the person relying on the implicit assumption, and reconstruct it as a deductive argument; whatever extra premise you have to add to make the deduction valid is the implicit assumption.
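As a worked example (the Socrates argument below is my own illustration, not something from the thread): if someone argues "Socrates is a man, therefore Socrates is mortal", reconstructing it as a valid deduction forces the hidden premise "all men are mortal" into the open. A minimal sketch in Lean:

```lean
-- Hypothetical reconstruction of an enthymeme. The stated argument supplies only
-- `h_man` (premise) and the goal `Mortal socrates` (conclusion); the extra premise
-- `h_all` that must be added for the deduction to go through is the implicit assumption.
variable (Person : Type) (Man Mortal : Person → Prop) (socrates : Person)

example (h_all : ∀ x, Man x → Mortal x) (h_man : Man socrates) : Mortal socrates :=
  h_all socrates h_man
```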

Here is a video that explains perfectly how it's done:

?si=qiZbDKctzEbDg_1y
1 week

AI generated content, yay or nay?



Enas:
You failed to actually answer my question. Is what is described in the page I linked what you meant by "logical circularity"?

FatGTP:
Implicitly I already answered your question. Playing dumb won't help you maneuver out of the dead end you're stuck in.

Enas:
I will take this as a yes, just because what you said earlier sounded like it matches what I linked.

So, by definition, circular reasoning requires two assertions: one must necessarily be a premise and the other must necessarily be the conclusion. But in what I said, for which you accused me of circular reasoning, I only made one assertion, not two. That means it is definitely not circular reasoning. So this accusation falls apart.

In fact, precisely because it was a single assertion and not an argument to begin with, it cannot constitute any kind of logical fallacy. Assertions by themselves cannot be fallacious. All they can be is true or false.

FatGTP:
You are clearly overlooking the implicit assumptions you rely on. That's why the circularity is invisible to you. It will be difficult to help you from the outside as long as you keep confirming those assumptions to yourself.

Enas:
I would again ask you to elaborate, but this is getting boring. You are no longer engaging with the thread.

I explained why it is impossible for what I said earlier to be circular reasoning, and you are completely ignoring that.

Instead you prefer to act mystical, just like intellectual people who oftentimes hide behind complicated terminology. I know that because you accuse and assert things without really elaborating on what you mean.

FatGTP:
I already said I can't help you from the outside. You've put yourself in a position, by selectively choosing evidence, that only ever confirms your beliefs. I can't reach you with facts or proof. Or hypothetically, is there anything that could change your views on AI?


I have not expressed my opinions on AI in general (heck, I have created AI myself, I'm a game dev after all), just on generative AI: ChatGPT, DeepSeek, etc.

If you want to change my views on that, your best shot is to provide me with results that are different enough to convince me that I don't have the full picture.

What you can't do is convince me that ChatGPT does not hallucinate a lot, or that it's good and helpful for society. That is because I have first-hand experience of that, and the scientific consensus suggests that AI has already created mass psychosis, addiction, learned helplessness, and more.
1 week

AI generated content, yay or nay?



Enas:
You failed to actually answer my question. Is what is described in the page I linked what you meant by "logical circularity"?

FatGTP:
Implicitly I already answered your question. Playing dumb won't help you maneuver out of the dead end you're stuck in.

Enas:
I will take this as a yes, just because what you said earlier sounded like it matches what I linked.

So, by definition, circular reasoning requires two assertions: one must necessarily be a premise and the other must necessarily be the conclusion. But in what I said, for which you accused me of circular reasoning, I only made one assertion, not two. That means it is definitely not circular reasoning. So this accusation falls apart.

In fact, precisely because it was a single assertion and not an argument to begin with, it cannot constitute any kind of logical fallacy. Assertions by themselves cannot be fallacious. All they can be is true or false.

FatGTP:
You are clearly overlooking the implicit assumptions you rely on. That's why the circularity is invisible to you. It will be difficult to help you from the outside as long as you keep confirming those assumptions to yourself.


I would again ask you to elaborate, but this is getting boring. You are no longer engaging with the thread.

I explained why it is impossible for what I said earlier to be circular reasoning, and you are completely ignoring that.

Instead you prefer to act mystical, just like intellectual people who oftentimes hide behind complicated terminology. I know that because you accuse and assert things without really elaborating on what you mean.
1 week

AI generated content, yay or nay?



Enas:
You failed to actually answer my question. Is what is described in the page I linked what you meant by "logical circularity"?

FatGTP:
Implicitly I already answered your question. Playing dumb won't help you maneuver out of the dead end you're stuck in.


I will take this as a yes, just because what you said earlier sounded like it matches what I linked.

So, by definition, circular reasoning requires two assertions: one must necessarily be a premise and the other must necessarily be the conclusion. But in what I said, for which you accused me of circular reasoning, I only made one assertion, not two. That means it is definitely not circular reasoning. So this accusation falls apart.

In fact, precisely because it was a single assertion and not an argument to begin with, it cannot constitute any kind of logical fallacy. Assertions by themselves cannot be fallacious. All they can be is true or false.
1 week

AI generated content, yay or nay?




Enas:
By "logical circularity" you mean this?
en.wikipedia.org/wiki/Circular_reasoning


FatGTP:
The pernicious part is that you yourself are trapped in it. Even if Wikipedia explains the mechanism to you in detail, you'll struggle to recognize what it describes within your flawed thesis. That, too, is inherent to the situation.


You failed to actually answer my question. Is what is described in the page I linked what you meant by "logical circularity"?
1 week