lividhen

joined 1 year ago
[–] [email protected] 2 points 1 year ago

I have some nostalgia for when they introduced the appstore. Aaaand that's where the nice things I have to say end.

[–] [email protected] 1 points 1 year ago

Have a Z Flip 5, which has the same SoC as the S23, and I super love it! If you don't end up loving One UI you can always customize the heck out of it with Samsung Theme Park or (third party) Hex Installer.

 

Currently typing this on my Flip 5 connected to an external display via HDMI. This didn't work on the previous generations, only on the Fold. There is no toggle for DeX, which is exceptionally disappointing. Does anyone know if you can launch it via adb?

[–] [email protected] 4 points 1 year ago (2 children)

They're stiff? That doesn't sound right...

[–] [email protected] 1 points 1 year ago

This is so funny 🤣
Any idea why it's called Disney vacation?

[–] [email protected] 26 points 1 year ago

I thought this post was satirical at first before I realized they were a real company 😅

[–] [email protected] 3 points 1 year ago (1 children)

It's so cool!!!
Do you have a pattern or a general path you follow when making it?

[–] [email protected] 3 points 1 year ago

I only own a Switch to buy Nintendo games to play them on my Steam Deck, as I'd rather not pirate them if I have the option to pay for them.

3
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]
 

So far I have tried a USB stick, a Linux storage gadget, and two microSD cards. Both the USB stick and the emulated storage device booted to the logo, then a black screen where the fan spun down, then sat there for three hours before I tried something else. The weird thing about both microSD cards is that after booting them once, the Deck no longer recognises them as bootable drives; I have to reflash the recovery image before I can boot from them again. One of the cards (a cheap nameless 16 GB) booted to the logo and sat there for several hours, and the other (a 1 TB SanDisk) booted to the logo, then a black screen, and is still sitting here two hours later.

Edit: leaving the 1TB overnight got as far as the touchpads vibrating when touched.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I wonder if we could use LSPatch (or similar) to hook in and change that. I might have a project ahead of me. We'll see.

[–] [email protected] 0 points 1 year ago (2 children)

No idea if this would be possible, but my starting idea here is using a server of some sort to grab the Firebase notifications and push them all over one websocket, similar to GMS. What I don't know is how registering apps works. I know from microG that apps register themselves to receive notifications, but I'm not sure what data is actually given to microG, or what the relationship is between the app and microG/the websocket.

[–] [email protected] 0 points 1 year ago (4 children)

I will have to look at how that works. Maybe I can run a server on a computer to push to ntfy.
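For what it's worth, ntfy's publish API is just an HTTP POST of the message body to the topic URL, so the server side can be a one-liner. A minimal sketch (the topic name is a made-up example, and the command is echoed rather than executed so nothing actually gets sent):

```shell
# Publishing to ntfy is a plain HTTP POST of the message body to the topic URL.
# "my-deck-alerts" is a hypothetical example topic; the command is echoed here
# instead of executed so nothing is actually sent anywhere.
topic="my-deck-alerts"
cmd="curl -d 'Job finished' https://ntfy.sh/$topic"
echo "$cmd"
```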

[–] [email protected] 2 points 1 year ago (1 children)

My bad! I meant LSPatch.

 

I have a US Samsung phone, so I am unable to root and replace GMS with microG. Would it be possible to patch apps (with LSPatch, for example) to use another app for push notifications? Firebase notifications are the only reason I still have GMS at all.

 

I need some help figuring out Elasticsearch. My end goal at the moment is to get the full text search ownCloud app working. Both are in Docker containers (docker compose). I am able to enter my URL in the ownCloud settings (http://es01:9200) and hit "setup index". After that it does not index anything, or, I think, pass anything on to Elasticsearch at all. Which leads me to my second problem: I can't figure out how to use Kibana to help debug anything, so I have no idea if ownCloud is even trying to send data to Elasticsearch. It currently just says "0 nodes marked as indexed, 0 documents in index using 225 bytes".
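A few sanity checks worth trying before reaching for Kibana: Elasticsearch exposes its state over plain HTTP, so you can query it directly with curl. The sketch below just prints the commands (the es01 address is taken from the settings URL above; run the printed commands from a shell inside the ownCloud container so the Docker network name resolves):

```shell
# Elasticsearch can be inspected over plain HTTP, no Kibana needed.
# ES_URL defaults to the es01 address from the compose setup.
es=${ES_URL:-http://es01:9200}
checks="$es/_cluster/health?pretty
$es/_cat/indices?v
$es/_count"
# print each check as a ready-to-run curl command
printf '%s\n' "$checks" | while read -r url; do
    echo "curl -s '$url'"
done
```

If `_cat/indices` never shows a new index appearing after you hit "setup index", ownCloud isn't reaching Elasticsearch at all; if an index exists but the document count stays at 0, the connection works and the indexing job itself is the problem.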

Here is my compose file. It's kind of a hodgepodge of things from the web just to get it to start 😅.
docker-compose.yml
.env

 

Not associated with r/takeaplantleaveaplant but it's the same idea.

Lemmy: takeaplantleaveaplant / [email protected]

Kbin: @takeaplantleaveaplant

 

'Salvage' from reddit
Original post:

For those of us with the 64 gig model, having a small home partition can be an issue from time to time. While you can symlink compatdata and shadercache to the sdcard, flatpaks are more difficult and you may have other files taking up space too.

Disclaimer

I am not responsible for you borking your deck or losing any data. This is not for everyone, and if you can get away with symlinking I highly recommend doing that instead. You will not be able to swap out the sdcard without data loss after this process.

Prerequisites

  1. You must have a btrfs-formatted home partition. You can find out how to do this [here](https://gitlab.com/popsulfr/steamos-btrfs)
  2. You must back up any existing data on your sdcard to an external device or an off site location such as cloud storage. This can take a long time depending on how much data you have.
  3. (optional) Back up your /home partition to an external device or off site location such as cloud storage.
  4. You must have a sudo password.

Using your sdcard as Adoptable Storage

There may be a way to do this more directly without data loss, but I will go with my current solution based on my somewhat limited knowledge of the btrfs filesystem.

How to

  1. Make sure your home partition is btrfs and that all the data you care about is backed up.
  2. Open konsole and enter a su shell: sudo -s
  3. Unmount your sdcard using umount /run/media/mmcblk0p1
  4. If your sdcard is not already formatted btrfs from SteamOS BTRFS, do mkfs.btrfs /dev/mmcblk0p1
  5. Add the sdcard to the home volume: btrfs device add -f /dev/mmcblk0p1 /home
  6. (optional) Balance the filesystem: btrfs filesystem balance start /home. This can take a VERY long time.
  7. Reboot and go back to desktop mode.
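The destructive steps above can be sketched as a dry-run script first (device path assumed to be /dev/mmcblk0p1 as in the steps; the run wrapper only prints each command, so nothing is touched until you change it to actually execute):

```shell
#!/bin/sh
# Dry-run sketch of steps 3-6: each command is printed, not executed.
# Change run()'s body to "$@" only once you're sure the device path is right.
DEV=${DEV:-/dev/mmcblk0p1}

run() { echo "+ $*"; }

run umount /run/media/mmcblk0p1       # step 3: unmount the sdcard
run mkfs.btrfs "$DEV"                 # step 4: format it btrfs (destroys all data!)
run btrfs device add -f "$DEV" /home  # step 5: attach it to the /home volume
run btrfs balance start /home         # step 6 (optional): rebalance across devices
```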

Restoring data

I will go through the steam folder, but for other files you can pretty much drag and drop.

  1. Copy all the data you backed up to /home/deck
  2. Enter your steam library folder that was originally on your sdcard.
  3. Merge the steamapps folder with your main steam library folder (usually /home/deck/.steam/steam/steamapps)
  4. Go to your steam library and click install on all the games that were on your sdcard. Steam should verify them and then let you play. If you know a way to merge the .vdf file please mention it.

Check your swap file

In my case the swap file was not working, so you may have to create a new one too.

  1. Run the command free -m.
  2. If the Swap row doesn't say 0 0 0, you're all good. If it does, continue on.
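The same check can be scripted by reading SwapTotal from /proc/meminfo instead of eyeballing free -m (just a convenience sketch):

```shell
#!/bin/sh
# Read the total active swap (in kB) straight from /proc/meminfo;
# a non-zero value means a swap area is enabled.
swap_kb=$(awk '/^SwapTotal:/ {print $2}' /proc/meminfo)
if [ "${swap_kb:-0}" -gt 0 ]; then
    echo "swap enabled: ${swap_kb} kB"
else
    echo "no swap enabled"
fi
```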

No swap file

Note: I was not able to use the existing @swapfile subvolume from SteamOS BTRFS for some reason, so I will show the process of making a new one.

  1. Open konsole and enter a su shell: sudo -s
  2. Create the new subvolume: btrfs subvolume create /home/@swap.
  3. Disable copy-on-write so the file can be used as swap: chattr -R +C /home/@swap.
  4. Create the swapfile with truncate -s 0 /home/@swap/swapfile
  5. Allocate storage to the swap file. You can allocate as much as you'd like. I tend to do a lot of memory-heavy things, so I will make it 16 gigs: fallocate -l 16384M /home/@swap/swapfile.
  6. Set the permissions for it: chmod 600 /home/@swap/swapfile.
  7. Make it a swap file: mkswap /home/@swap/swapfile
  8. Add the following line to /etc/fstab and save it: /home/@swap/swapfile swap swap defaults 0 0
  9. Enable swap: swapon /home/@swap/swapfile
  10. Run free -m to check if it worked.
  11. Reboot and run free -m in konsole again to make sure it stuck.
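As a quick sanity check on the fstab entry from step 8: a valid fstab line has exactly six whitespace-separated fields (device, mount point, type, options, dump, pass). A throwaway check:

```shell
#!/bin/sh
# Split the fstab entry from step 8 on whitespace and confirm the
# six-field layout and the "swap" filesystem type.
entry='/home/@swap/swapfile swap swap defaults 0 0'
set -- $entry
echo "fields: $#  type: $3"   # prints: fields: 6  type: swap
```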

Leaving extra unallocated space

Btrfs is bad at handling running out of space! Like, really bad! It basically makes itself read-only with no way out. To avoid this you should leave a few gigs of unallocated space.

Setting up pacman

If you already have pacman set up skip this.

  1. Open konsole and disable the read only filesystem: sudo steamos-readonly disable
  2. Run sudo pacman-key --init
  3. Populate the keyring: sudo pacman-key --populate archlinux
  4. Update with sudo pacman -Syu

Gparted

  1. Install gparted: sudo pacman -S gparted
  2. Run gparted and give it your sudo password.
  3. Select the physical device you want to shrink the partition on.
  4. If possible prevent anything from writing to the disk to avoid corruption.
  5. Right click the allocated space and hit resize/move.
  6. Enter the amount in the Free space following (MiB) field. I did 3 gigabytes, so that would be 3072.

"Help I ran out of space!"

If you have done this process, just expand the btrfs partition with gparted, delete some files, and run sudo btrfs balance start /home in konsole and wait. Wait a long time. Probably several hours.

After that re-create the unallocated space in case it happens again.

Done!

You now should have a home partition that spans across multiple devices and a working swap file.

P.S. I'm writing this because I have a migraine and can't do anything, but I'm bored. So if there are any mistakes please yell at me in the comments 😂

3
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]
 

'Salvage' from reddit.
Original post:

What is apx?

Apx (/à·peks/) is Vanilla OS's package manager. Since Vanilla OS is for the most part an immutable operating system, it can't write packages to the root partition. Instead, it creates and manages containers using Distrobox and Podman (or Docker) for various other package managers such as yay, apt, dnf, zypper, and several others. Any package installed in a container can easily be exposed to the host OS.

Why would I want this?

Apx is intended to avoid package conflicts while giving you every possible place to get packages from. Say you want to build software that is easier to build on Ubuntu than on Arch: you could run apx install <insert packages here> to install the dependencies without having to find the Arch package names or risk conflicting packages. These packages also won't take up space on the fairly-limited-in-size root partition.

Prerequisites

  1. Have pacman set up.
  2. A disabled read-only filesystem.
  3. An internet connection.
  4. Desktop mode.

I recommend you read through all of this first and look up any commands you aren't sure about before running them.

How to

  1. In a konsole window, install the necessary packages: sudo pacman -Syu base-devel holo-rel/linux-headers linux-neptune-headers holo-rel/linux-lts-headers git glibc gcc gcc-libs fakeroot linux-api-headers libarchive go podman
  2. Clone the apx repo and enter the directory it downloads: git clone --recursive https://github.com/Vanilla-OS/apx.git && cd apx
  3. Build apx with make build
  4. Then install it: sudo make install

Fixing podman

At this point it will not work as podman doesn't work out of the box. Run the following commands to fix it.

sudo touch /etc/subuid

sudo usermod --add-subuids 10000-75535 deck

sudo touch /etc/subgid

sudo usermod --add-subgids 10000-75535 deck  
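Each of those usermod calls writes one start:count line; 10000-75535 is a range of 65536 IDs, so the resulting line should read deck:10000:65536. Simulated here against a temp file rather than the real /etc/subuid, so nothing on the system is modified:

```shell
#!/bin/sh
# usermod --add-subuids 10000-75535 deck appends "deck:10000:65536"
# (user:start:count) to /etc/subuid. Simulated with a temp file.
tmp=$(mktemp)
echo "deck:10000:65536" > "$tmp"
range=$(awk -F: '$1 == "deck" { print "start", $2, "count", $3 }' "$tmp")
echo "$range"   # prints: start 10000 count 65536
rm -f "$tmp"
```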

Allowing graphical applications

(skip if you don't need em)

  1. Run kate /etc/systemd/system/apxgui.service
  2. Paste in the following:
[Unit]
Description=Run xhost + on boot

[Service]  
Type=simple
ExecStart=xhost +  

[Install]  
WantedBy=default.target  
WantedBy=graphical-session.target  

  3. Run sudo systemctl enable apxgui to enable the service.
  4. Run sudo systemctl start apxgui to start the service.
  5. Run xhost +.

Using apx

Run apx to see a list of arguments and options.

To install packages run apx install <package name>. By default it uses the apt container, but you can make it use any other container by using an argument. For example, apx install --xbps neofetch will install neofetch from the Void Linux repository.

You can run programs from specific containers the same way. apx run --xbps neofetch will run the version of neofetch you have installed in the xbps container.

Entering a container's environment is the same: apx enter --yay. You can then use it as if you were using that OS.

If it's not working, that means I probably missed something, so please let me know. I had to noodle through this and reconstructed it from my bash history afterwards, so there is a chance I missed a required package or something.
