bitfucker

joined 4 months ago
[–] [email protected] 4 points 1 week ago

I am his distant cousin

[–] [email protected] 9 points 1 week ago

Anything not advertised as E2EE can be assumed to have some 3rd party able to look at the conversation, malicious or not.

[–] [email protected] 2 points 1 week ago

Man, and here some people are literally struggling due to a lack of dopamine just because their brains are built differently.

[–] [email protected] 3 points 1 week ago (2 children)

Where do you shit?

[–] [email protected] 10 points 1 week ago

Your taxes have been received. Have a great day!

[–] [email protected] 14 points 1 week ago (2 children)

Pay your due tax please

[–] [email protected] 12 points 1 week ago

You mean interaction right? ...right?

[–] [email protected] 4 points 1 week ago

Maybe it's got something to do with his username

[–] [email protected] 3 points 2 weeks ago

Has anyone watched Babish Culinary Universe and seen him drunk on Gatorwine? That could be surprisingly good... or not.

[–] [email protected] 2 points 2 weeks ago

Aye, that's a fair assumption

[–] [email protected] 5 points 2 weeks ago

Research and development is tricky because you never know how much more progress you'll need before reaching a satisfying result.

[–] [email protected] 2 points 2 weeks ago (2 children)

*Rant about the beginning of the article ahead

Why in the name of god did they try to bring LLMs into the picture? Saying AI/ML is good enough for predictive maintenance tasks, but noooo, it has to be an LLM. If they want to be specific, then don't be misleading: I think what they actually mean is the attention layer/operation, commonly used in LLMs, applied to capture time-series data. I understand that recurrent-style neural networks and LSTMs have their limitations, and I agree that exploring attention for time-series data is interesting research. But an LLM? Just no.
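To illustrate the distinction being made: the attention *operation* is just a weighted mixing of timesteps, and you can apply it to a time series without any language model at all. Here is a minimal NumPy sketch of single-head scaled dot-product self-attention over a series; the projection weights are random placeholders purely for illustration, not trained parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, d_k=None, seed=0):
    """Single-head scaled dot-product self-attention over a time series.

    x: array of shape (T, d) -- T timesteps, d features per step.
    Returns an array of shape (T, d_k) where each timestep's output is a
    weighted mix of all timesteps. Weights are random placeholders.
    """
    T, d = x.shape
    d_k = d_k or d
    rng = np.random.default_rng(seed)
    Wq, Wk, Wv = (rng.standard_normal((d, d_k)) for _ in range(3))
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(d_k)      # (T, T): each step attends to every step
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # (T, d_k) context-mixed representation
```

Nothing here involves tokens, a vocabulary, or pretraining on text, which is the point: attention for time series is a standalone research direction, not the same thing as "using an LLM".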
