yboutros

joined 1 year ago
[–] [email protected] 38 points 3 days ago (8 children)

I wish more guys just said they didn't know something instead of clearly not knowing what they're talking about and running their mouths based on vibes

[–] [email protected] 2 points 3 days ago* (last edited 3 days ago)

I sort of agree, but I think it depends on effort.

Type one word in and try to sell the easiest generated image? Low value.

But typing the right combination of prompts to create assets for something larger than the model is capable of on its own? That's more valuable.

Criticizing AI or artists that leverage AI is like criticizing an artist for using a printer instead of drawing by hand

Or saying someone's digital work is inferior because they used a tool to help make their image...

On that note, when working on a large project, is an AI artist as pretentious as the artist in the comic because they got some help generating the project from an AI instead of another human? Or is someone's work ethic less credible because they Google searched instead of asking a person? Are works of art valuable because they're entirely original and uninfluenced by anything but the artist themselves? Because by that metric no artists are valuable, since nothing is entirely original anyway

[–] [email protected] 24 points 6 days ago (2 children)

25% of Reddit comments are ChatGPT trash, if not worse. It used to be an excellent open source intelligence tool, but now it's just a bunch of fake-supportive and/or politically biased bots

I will miss Reddit's extremely niche communities, but I believe Lemmy has reached the inflection point where it will eventually develop the same level of niche communities

[–] [email protected] 6 points 6 days ago (2 children)

Don't tell him; if too many people get ad blockers, they're just going to keep evolving

 

When training a transformer on positionally encoded embeddings, should the tgt output embeddings also be positionally encoded? If so, wouldn't the predicted/decoded embeddings also be positionally encoded?
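
For concreteness, here's a minimal sketch of the setup I'm asking about, using a vanilla nn.Transformer and a hand-rolled sinusoidal PositionalEncoding helper (the helper, shapes, and sizes are purely illustrative, not from any particular codebase):

import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    # standard sinusoidal encoding, written out just for illustration
    def __init__(self, d_model, max_len=5000):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        pos = torch.arange(max_len).unsqueeze(1).float()
        div = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)

    def forward(self, x):  # x: (batch, seq, d_model)
        return x + self.pe[: x.size(1)]

d_model = 64
model = nn.Transformer(d_model=d_model, nhead=8, batch_first=True)
pos = PositionalEncoding(d_model)

src = torch.randn(2, 10, d_model)  # already-embedded source sequence
tgt = torch.randn(2, 9, d_model)   # already-embedded, shifted target sequence

# This is the question: positionally encode only src, or tgt as well?
out = model(pos(src), pos(tgt))    # out: (2, 9, d_model) decoder predictions

Put differently: if tgt gets pos(tgt) on the way in, do the rows of out carry that positional signal too, or should the training target be compared against the raw, un-encoded embeddings?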

[–] [email protected] 10 points 2 weeks ago (1 children)

You're right, we need water fountains with milk instead

[–] [email protected] 8 points 2 weeks ago* (last edited 2 weeks ago)

Meanwhile: NixOS

[–] [email protected] 2 points 3 weeks ago

538's model was a good estimator that year too; they leaned towards Hillary (and to be fair, she did win the popular vote) but certainly kept a Trump win in the swing states within the margin of error.

270toWin is another good site

[–] [email protected] 15 points 1 month ago

Fake. My parents didn't have a stable marriage

[–] [email protected] 1 points 1 month ago (1 children)

I'll look into LN more. I'm familiar with the centralization concerns (but still think they can be mitigated until more upgrades land), though I'm not familiar with the costs you're bringing up. Fee estimators notoriously round up; I've never spent more than a dollar, but that's anecdotal

BCH is still an attempt at centralization from Bitmain, a company which literally installed kill switches in their miners without telling anyone, and ran botting attacks in /r/Bitcoin and /r/BTC during that fiasco - the hard fork they created is absolutely more centralized than Bitcoin

There will be a time to do something as risky as a hard fork for a block size upgrade, but doing something that serious for the sake of just one upgrade doesn't make sense to me. If a hard fork must happen, it might as well include other BIPs that necessitate a hard fork, like Drivechain.

Soft fork upgrades which enable more efficient algorithms, like Schnorr / SegWit, have scaled TPS in the meantime without having to waste block space. BCH is cheap because there's no demand or usage.

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

Fiat makes itself obsolete

[–] [email protected] 2 points 1 month ago (3 children)

Bitcoin Cash was an attempt at centralized control by Jihan Wu. Just because the block size is bigger doesn't mean it's better for decentralization. In fact, the increased cost of maintaining a node just makes it harder for people in (typically poorer) oppressive countries to self-verify

They are still increasing the TPS; the Lightning Network isn't perfect, but it can scale beyond Visa until more upgrades are implemented

[–] [email protected] 4 points 1 month ago

Ollama (+ web-ui if you want, but ollama serve & followed by ollama run is all you need), then compare and contrast the various models

I've had luck with Mistral for example
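
If it helps, here's a rough sketch of scripting the compare-and-contrast against a running ollama serve, assuming the default local API on port 11434; the model names are just examples of whatever you've already pulled:

import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default endpoint exposed by `ollama serve`

def ask(model, prompt):
    # stream=False so the whole reply comes back as a single JSON object
    r = requests.post(OLLAMA_URL, json={"model": model, "prompt": prompt, "stream": False}, timeout=300)
    r.raise_for_status()
    return r.json()["response"]

prompt = "Summarize the tradeoffs of running LLMs locally in three bullet points."
for model in ["mistral", "llama2"]:  # swap in whatever `ollama pull` has fetched
    print(f"--- {model} ---")
    print(ask(model, prompt))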

 

Went through the pain of packaging a Python project on NixOS. Here are some issues I hit, and how I got lucky resolving them. I feel the most reliable way of doing this in the future is to use Docker and just build imperatively.

Here's how I got web drivers, AI dependencies, GPU dependencies, and an API dependency bundled together into an ephemeral shell for Python development, on NixOS 23.11:

  1. Enable Flakes

  2. Start with setting up poetry2nix

  3. Get the template flake by running nix flake init --template github:nix-community/poetry2nix

  4. In the flake.nix, sometimes changing projectDir = self to projectDir = ./. fixed some issues

  5. In your terminal, run nix develop . to build the Poetry app with the Python packages described in pyproject.toml

  6. By default, just Poetry and the latest Python should be installed. The dependencies for the project (which get reflected in pyproject.toml) are updated with poetry add, such as poetry add numpy selenium scikit-learn (a quick smoke test for these is sketched just after this list)

  7. Exit out of the ephemeral shell from nix develop ., and rerun to have poetry2nix rebuild and link the newly declared packages
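
A smoke test along these lines (nothing project-specific, just imports and a trivial fit) confirms from inside the nix develop shell that the step 6 dependencies actually resolved:

# smoke_test.py - run with `python smoke_test.py` inside the `nix develop .` shell
import importlib.metadata as md

for dist in ["numpy", "selenium", "scikit-learn"]:
    # raises PackageNotFoundError if poetry2nix didn't actually link the package
    print(f"{dist}: {md.version(dist)}")

import numpy as np
from sklearn.linear_model import LinearRegression

# trivial fit just to prove the scientific stack is usable, not merely importable
X = np.arange(10).reshape(-1, 1)
reg = LinearRegression().fit(X, 2 * X.ravel() + 1)
print("fitted slope (should be ~2):", reg.coef_[0])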

Poetry2nix has worked pretty well for some of the more obscure Python packages, but failed on others. For example, sentence-transformers would depend on maturin, which would fail to link setuptools. If Poetry doesn't work, you can try to get the package from nixpkgs, or specify sha256s from pypi.org

Here's an example of what I added to my flake.nix to get GPU acceleration, sentence-transformers, Firefox drivers for Selenium, and other packages Poetry failed to set up:

packages = [ pkgs.poetry pkgs.python311Packages.sentence-transformers pkgs.firefox 
            pkgs.python311Packages.openai pkgs.python311Packages.yt-dlp pkgs.python311Packages.pyopencl
];

was added to this flake.nix, as in,

{
  description = "Application packaged using poetry2nix";

  inputs = {
    flake-utils.url = "github:numtide/flake-utils";
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    poetry2nix = {
      url = "github:nix-community/poetry2nix";
      inputs.nixpkgs.follows = "nixpkgs";
    };
  };
  outputs = { self, nixpkgs, flake-utils, poetry2nix }:
    flake-utils.lib.eachDefaultSystem (system:
      let
        # see https://github.com/nix-community/poetry2nix/tree/master#api for more functions and examples.
        pkgs = nixpkgs.legacyPackages.${system};
        inherit (poetry2nix.lib.mkPoetry2Nix { inherit pkgs; }) mkPoetryApplication;
      in
      {
        packages = {
          myapp = mkPoetryApplication {
            projectDir = ./.;
          };
          default = self.packages.${system}.myapp;
        };
        devShells.default = pkgs.mkShell {
          inputsFrom = [ self.packages.${system}.myapp ];
          packages = [ pkgs.poetry pkgs.python311Packages.sentence-transformers pkgs.firefox 
            pkgs.python311Packages.openai pkgs.python311Packages.yt-dlp pkgs.python311Packages.pyopencl
          ];
          nativeBuildInputs = [(
            pkgs.python311Packages.buildPythonPackage rec {
              pname = "serpapi";
              version = "0.1.5";
              src = pkgs.python311Packages.fetchPypi {
                inherit pname version;
                sha256 = "b9707ed54750fdd2f62dc3a17c6a3fb7fa421dc37902fd65b2263c0ac765a1a5";
              };
            }
          )];
        };
      });
}

There was one package (serpapi) which was not in nixpkgs and which Poetry failed to build as well. Adding this to nativeBuildInputs got serpapi installed:

nativeBuildInputs = [(
            pkgs.python311Packages.buildPythonPackage rec {
              pname = "serpapi";
              version = "0.1.5";
              src = pkgs.python311Packages.fetchPypi {
                inherit pname version;
                sha256 = "b9707ed54750fdd2f62dc3a17c6a3fb7fa421dc37902fd65b2263c0ac765a1a5";
              };
            }
)];

All in all, it works, and I have no doubt I've made a reproducible environment. What attracts me is that I've never had an easier time setting up CUDA/cuDNN/TensorRT/... System drivers have been near effortless, and much faster to set up than on Debian. Tools like sentence-transformers and torch default to packages which leverage the GPU.
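
As a rough check that the GPU path and the packages Poetry couldn't handle are all visible from the same shell, something like this is enough (the all-MiniLM-L6-v2 model name is just an example; it gets downloaded on first use):

import importlib.metadata as md
import torch
from sentence_transformers import SentenceTransformer

print("serpapi version:", md.version("serpapi"))   # the package built via nativeBuildInputs
print("CUDA available:", torch.cuda.is_available())

# example model; any small sentence-transformers checkpoint does the job
device = "cuda" if torch.cuda.is_available() else "cpu"
model = SentenceTransformer("all-MiniLM-L6-v2", device=device)
emb = model.encode(["hello from the nix develop shell"])
print("embedding shape:", emb.shape)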

What pushes me away is that I've had failures in each of the three methods for specifying package dependencies, even though one of the three eventually turned out to be the fix for integrating the dependencies into my shell. For now, I'll stick with it, but it's hard for me to suggest to a team that we use this in development.

 

I set up a Next.js project with pkgs.mkShell and used nix develop to automatically build the project. However, when I leave the shell, the files persist. How should (or can?) I set up my shell.nix so that files in the directory it drops into are automatically removed when leaving the ephemeral shell?
