Hi, once in a while I try to clean up my tabs. The first thing I do is use “merge all windows” to put all tabs into one window.

This often causes a memory clog, and Firefox gets stuck in this state for 10-20 minutes.

I have recorded one such instance.

I have tried using the “discard all tabs” add-on; unfortunately, it also gets frozen by the memory clog.

Sometimes I will just reboot my PC as that is faster.

Unfortunately, killing Firefox this way does not save the new tab order, so when I start Firefox again it will have 20+ windows open, which I merge again, and then it clogs again!

So far the only solution I have found is to just wait out the 20 minutes.

Once the “memory clog” has passed, it runs just fine.

I would like better control over tab discarding, and maybe some way of limiting bloat. For instance, I would rather keep a lower number of undiscarded YouTube tabs, as they seem to be insanely bloated.

In other cases, for most websites, I would like to never discard the contents.

In my ideal world, tabs would get frozen and saved to disk permanently, rather than assuming discarded tabs can always be reloaded. As if the websites were going to exist forever and discarding a tab were just like clearing a cache.

  • optissima@possumpat.io · 5 months ago

    What I’d recommend, given your insistence on not changing your workflow, is to locally download the pages you have open with httrack, wget or a similar application. This would let you search all your tabs and their contents locally and very quickly without Google, and they would load faster because they don’t need to be re-downloaded, which, if I understand correctly, is what Firefox is trying to do at some level.
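
    Just as a rough sketch (assuming you can export your open tab URLs to a plain text file, one per line; the file name below is made up), the wget side could look something like this:

        import subprocess
        from pathlib import Path

        URL_LIST = Path("open_tabs.txt")   # hypothetical export of your open tab URLs, one per line
        OUT_DIR = Path("tab_mirror")       # where the local copies go
        OUT_DIR.mkdir(exist_ok=True)

        for url in URL_LIST.read_text().splitlines():
            url = url.strip()
            if not url:
                continue
            # -p: also fetch the images/CSS/JS the page needs
            # -k: rewrite links so the local copy works offline
            # -E: add .html extensions where appropriate
            subprocess.run(
                ["wget", "-p", "-k", "-E", "--directory-prefix", str(OUT_DIR), url],
                check=False,  # keep going even if one page fails
            )

    After that, grep or any desktop search tool pointed at the mirror folder already covers most of the “search my tabs without Google” part.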

    • interdimensionalmeme@lemmy.ml OP · 5 months ago

      Thanks, I didn’t know that one.

      I have been experimenting with a transparent proxy like Squid, or something like ArchiveBox, to create static pages on the fly and load those instead.

      But so far I haven’t made anything seamless and pleasant to use. It would have to be at least as low-friction as using Google.
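
      For what it’s worth, the ArchiveBox part is easy enough to drive from a script; something like this untested sketch (it assumes the archivebox CLI is installed and is run from inside an initialized collection folder):

          import subprocess

          # Hypothetical list of URLs to snapshot; in practice it would come
          # from an exported tab list or from the proxy's access log.
          urls = [
              "https://example.com/some-article",
              "https://example.com/another-page",
          ]

          for url in urls:
              # 'archivebox add <url>' saves a static snapshot of the page
              # into the current collection.
              subprocess.run(["archivebox", "add", url], check=False)

          # Running 'archivebox server' separately then serves the snapshots
          # locally, which is what I'd point the browser at instead of the
          # live pages.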

      I am going to try using Mixtral 8x7B to perform natural-language search over my archives and pull tabs from the collection of all pages I have ever seen. But that’s still a long way from being operational!
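
      As a sketch of what the retrieval half could look like, plain embedding search over the archived pages might be enough to start with (sentence-transformers and the model name below are stand-in choices, and it is not Mixtral doing the searching):

          from pathlib import Path

          import numpy as np
          from sentence_transformers import SentenceTransformer

          ARCHIVE_DIR = Path("archive_text")  # hypothetical folder of plain-text dumps of archived pages

          model = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs fine on CPU

          # Embed every archived page once (in practice these would be cached).
          paths = sorted(ARCHIVE_DIR.glob("*.txt"))
          texts = [p.read_text(errors="ignore")[:5000] for p in paths]  # truncate huge pages
          doc_vecs = model.encode(texts, normalize_embeddings=True)

          def search(query: str, k: int = 5):
              """Return the k archived pages most similar to the query."""
              q = model.encode([query], normalize_embeddings=True)[0]
              scores = doc_vecs @ q  # cosine similarity, since the vectors are normalized
              best = np.argsort(scores)[::-1][:k]
              return [(paths[i].name, float(scores[i])) for i in best]

          print(search("that firefox thread about discarding tabs and memory"))

      That way Mixtral would only have to rephrase the query or summarize the hits, rather than do the searching itself.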

      • optissima@possumpat.io · 5 months ago

        …has Google still been giving you the same results recently? It seems like an extremely weak link in your setup to me. If you insist on using search, you’d be better off looking at a locally run search engine like peARs, or something similar over locally downloaded and indexed files, and it’ll be waaaay more reliable than an LLM here.
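
        As a very rough sketch of the “locally downloaded and indexed files” idea (Whoosh here is just an example library, not what peARs uses):

            from pathlib import Path

            from whoosh import index
            from whoosh.fields import Schema, ID, TEXT
            from whoosh.qparser import QueryParser

            PAGES_DIR = Path("tab_mirror")    # hypothetical folder of downloaded pages
            INDEX_DIR = Path("search_index")
            INDEX_DIR.mkdir(exist_ok=True)

            # Build the index once over the downloaded pages (raw HTML here;
            # stripping tags first would give cleaner results).
            schema = Schema(path=ID(stored=True, unique=True), body=TEXT)
            ix = index.create_in(str(INDEX_DIR), schema)
            writer = ix.writer()
            for page in PAGES_DIR.rglob("*.html"):
                writer.add_document(path=str(page), body=page.read_text(errors="ignore"))
            writer.commit()

            # Query it like a tiny private search engine.
            with ix.searcher() as searcher:
                query = QueryParser("body", ix.schema).parse("firefox tab discard memory")
                for hit in searcher.search(query, limit=10):
                    print(hit["path"])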

        • interdimensionalmeme@lemmy.ml OP · 5 months ago

          Google is giving me increasingly poor results; I am looking into deploying SearXNG locally.

          I really would like to operate my own local crawler and sorting algorithm.
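
          Just to sketch what I mean by a local crawler (requests and BeautifulSoup are placeholder choices, and this ignores robots.txt and everything else a real crawler needs):

              import time
              from collections import deque
              from urllib.parse import urljoin, urlparse

              import requests
              from bs4 import BeautifulSoup

              START = "https://example.com/"   # hypothetical seed page
              MAX_PAGES = 50                   # keep the crawl tiny

              seen, queue = set(), deque([START])
              pages = {}  # url -> extracted text, to be indexed and ranked later

              while queue and len(pages) < MAX_PAGES:
                  url = queue.popleft()
                  if url in seen:
                      continue
                  seen.add(url)
                  try:
                      resp = requests.get(url, timeout=10)
                  except requests.RequestException:
                      continue
                  soup = BeautifulSoup(resp.text, "html.parser")
                  pages[url] = soup.get_text(" ", strip=True)
                  # Stay on the same host and queue up new links.
                  for a in soup.find_all("a", href=True):
                      link = urljoin(url, a["href"])
                      if urlparse(link).netloc == urlparse(START).netloc:
                          queue.append(link)
                  time.sleep(1)  # be polite to the server

              print(f"crawled {len(pages)} pages")

          The sorting part would then work over that pages dict, which is where a local index like the one above comes back in.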

          I will check out the peARs you mentioned!