I can’t say for sure, but there is a good chance I might have a problem.

The main picture attached to this post is a pair of dual-bifurcation cards, each with a pair of Samsung PM963 1T enterprise NVMes.

They are going into my r730XD, which… is getting pretty full. This will fill up the last empty PCIe slots.

But, knock on wood, my r730XD supports bifurcation! LOTS of bifurcation.

As a result, it now has more HDDs and NVMes than I can count.

What’s the problem, you ask? Well, that is just one of the many servers I have lying around here, all completely filled with NVMe and SATA SSDs…

Figured I would share. Seeing a bunch of SSDs is always a pretty sight.

And, as of two hours ago, my particular Lemmy instance was migrated to these new NVMes, completely transparently too.

krolden

Having a large flash pool really makes your life so much better.

Until you fill up all your space and have to buy more :p

HTTP_404_NotFound (OP)

Hopefully that doesn’t happen soon! I don’t have too much room for more flash, lol.

But, I have quite a bit of available space, so there shouldn’t be any concerns. Also, tomorrow, after a few adapters arrive, I’ll be adding another 2x 1T flash drives to my Optiplex 5060 SFF.

@Decronym@lemmy.decronym.xyz (bot account)

Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

NVMe: Non-Volatile Memory Express (interface for mass storage)
PCIe: Peripheral Component Interconnect Express
SATA: Serial AT Attachment (interface for mass storage)
SSD: Solid State Drive (mass storage)

4 acronyms in this thread; the most compressed thread commented on today has 3 acronyms.

[Thread #13 for this sub, first seen 8th Aug 2023, 21:55] [FAQ] [Full list] [Contact] [Source code]

Good bot

Good bot

Fantastic bot, honestly.

Well, this seems like a good problem to have, hahah. If you need to get rid of some of those SSDs, count me in.

HTTP_404_NotFound (OP)

eBay! You can pick up these “used” enterprise NVMes and SSDs for CHEAP. All 10 arrived with less than 5% wear.

What software are you running on all of this?

HTTP_404_NotFound (OP)

From https://lemmyonline.com/comment/768355

A bit of everything. Publicly facing websites, Lemmyonline.com, a few popular Discord bots.

Linux ISO collection and streaming.

Lots of automation.

Lots of things around software development. Lots of things around systems and network administration.

Some Kubernetes too.

A bit of everything, and nothing in particular.

What management interface is that, though, and is it part of the OS? What OS are you using, anyway?

GreyBeard

The first screenshot is of Dell’s built-in system tools for servers. Being a Dell server, it should have Dell’s iDRAC, which is a lights-out management module. It is really fantastic.

I wasn’t talking about that. I was talking about the second screenshot. Thanks anyway.

HTTP_404_NotFound (OP)

The bottom screenshot is from Proxmox, which is the top-level OS in play.

Ah okay. Maybe I should try that at some point. It’s been years since I used it last.

HTTP_404_NotFound (OP)

I honestly just started using it again a week or two ago. I have been extremely pleased with the features it offers.

The only problem I see is using x8 slots instead of x16 slots, for double the storage.

HTTP_404_NotFound (OP)

What’s the problem?

Each NVMe uses 4 lanes. For each of these x8 slots, they have two NVMes, for a total of 8 lanes.

The x16 slot already has 4x NVMe in it, lol. The other x16 slot has a GPU, which is located in that particular slot due to the lovely 3d-printed fan shroud.

One of the other full-height x8 slots also has a PLX switch, and is loaded with 4 more NVMes.
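For anyone juggling bifurcation and PLX cards like this, the width each slot actually negotiated can be checked from the `LnkSta` line in `lspci -vv` output. A small sketch; the device address in the comment and the sample line below are illustrative, not captured from this server:

```shell
# Extract the negotiated link width (e.g. x4) from an lspci LnkSta line.
link_width() {
    sed -n 's/.*Width \(x[0-9]*\).*/\1/p'
}

# On a live system you would feed it real output, for example:
#   sudo lspci -vv -s 0000:41:00.0 | grep 'LnkSta:' | link_width
sample='LnkSta: Speed 8GT/s (ok), Width x4 (ok)'
echo "$sample" | link_width    # → x4
```

A drive that negotiated x2 or x1 instead of x4 usually points at a bifurcation setting or riser problem.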

@feitingen@lemmy.world

Does the plx introduce noticeable latency, and does it get hot?

I want to get a few, but I don’t really have the airflow you do, so I’m a bit worried.

HTTP_404_NotFound (OP)

I have not noticed any issues with it.

Prior to Jan of this year, I used two of them in an r720xd because it didn’t support bifurcation, and I can’t say I ran into any issues.

I also haven’t checked to see whether it runs hot, though.

@maxprime@lemmy.ml

If that’s a problem, then I don’t want it to be solved.

HTTP_404_NotFound (OP)

It’s only a problem when you get the electric bill! (Or when the wife finds your eBay receipts.)

I doubt these use much power compared to their spinning-rust antecedents.

HTTP_404_NotFound (OP)

I meant my general electric bill. My server room averages 500-700 watts.

@steeev@midwest.social

Was curious how many watts this machine pulls? Also curious whether you had ever filled it with spinning disks; would flash be less power hungry?

HTTP_404_NotFound (OP)

This one averages around 220-250 watts.

It’s completely full of spinning disks. Flash would use less power, but would end up being drastically more expensive.

Scott

Do you happen to have a link to those cards?

HTTP_404_NotFound (OP)

Dual Slot Bifurcation Card: those are the ones I just picked up.

If you have a x16 slot, and can fit a full-height card, and use 4x4x4x4 bifurcation, the ASUS Hyper M.2 is really good.

Scott

Sweet!

I’ve got a Gen3 Hyper M.2, but I was looking for something for the x8 slots in one of my servers without needing full-height cards.

HTTP_404_NotFound (OP)

That’s the exact use case I got these for.

SirNuke

Do you have any trouble with cooling or anything with them? Got like a billion unused PCIe lanes in my Dell R730 and can think of a few things that might benefit from a big NVMe ZFS pool.

HTTP_404_NotFound (OP)

Generally, no.

I run a custom fan-control script which keeps the fans at around a 30% minimum, but increases them if needed.

Below 30%, some things were getting toasty.
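A fan-control loop like this usually boils down to mapping temperature to a duty cycle and pushing it over IPMI. Below is a hedged sketch, not the author’s actual script: the temperature thresholds are invented examples, and the `ipmitool raw 0x30 0x30 …` opcodes are the commonly cited iDRAC manual fan-control commands for this generation of Dell servers.

```shell
#!/bin/sh
# Sketch of a Dell fan-control loop. Thresholds are made-up examples.

# Map a CPU temperature (deg C) to a fan duty cycle (%), with a 30% floor.
fan_pct() {
    t=$1
    if   [ "$t" -ge 70 ]; then echo 100
    elif [ "$t" -ge 60 ]; then echo 60
    elif [ "$t" -ge 50 ]; then echo 40
    else echo 30
    fi
}

# Push a duty cycle to the BMC (requires ipmitool and iDRAC access).
set_fans() {
    ipmitool raw 0x30 0x30 0x01 0x00                             # take manual control
    ipmitool raw 0x30 0x30 0x02 0xff "0x$(printf '%02x' "$1")"   # set duty cycle
}

# Example loop (commented out in this sketch; the sensor parsing is
# hardware-specific and would need adjusting):
# while true; do
#     temp=$(ipmitool sdr type temperature | awk -F'|' 'NR==1 {print $5+0}')
#     set_fans "$(fan_pct "$temp")"
#     sleep 30
# done
```

The floor-plus-ramp shape matches the behavior described above: never below 30%, but climbing when things get toasty.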

@Millie@lemm.ee

I dream of this kind of storage. I just added a second M.2 with a couple of TB on it, and the space is lovely, but I can already see I’ll fill it sooner than I’d like.

HTTP_404_NotFound (OP)

I will say, it’s nice not having to nickel and dime my storage.

But, the way I have things configured, redundancy takes up a huge chunk of the overall storage.

I have around 10x 1T NVMe and SATA SSDs in a Ceph cluster. 60% storage overhead there.

Four 8T disks are in a ZFS striped mirror / RAID 10. 50% storage overhead.

The 4x 970 Evo / Evo Plus drives are also in a striped-mirror ZFS pool. 50% overhead.

But still PLENTY of usable storage, and highly available at that!

krolden

Any reason you went with a striped mirror instead of raidz1/raidz2?

HTTP_404_NotFound (OP)

The two ZFS pools are only 4 devices each. One pool is spinning rust, the other is all NVMe.

I don’t use RAID 5 for large disks, and instead go for RAID 6 / z2. Given that z2 and striped mirrors both have 50% overhead with only 4 disks, striped mirrors have the advantage of being much faster: double the IOPS, and faster rebuilds. For these particular pools, performance was more important than overall disk space.

However, before all of these disks were moved from TrueNAS to Unraid, there was an 8x8T Z2 pool, which worked exceptionally well.
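For reference, a 4-disk striped mirror like the pools described here is just two mirror vdevs listed in one `zpool create`. The pool and device names below are placeholders, not the author’s devices:

```shell
# Striped mirror (RAID 10 equivalent) from four disks; substitute real
# /dev/disk/by-id paths on an actual system.
zpool create -o ashift=12 fastpool \
    mirror nvme-disk0 nvme-disk1 \
    mirror nvme-disk2 nvme-disk3

# The raidz2 alternative, with the same 50% overhead at 4 disks:
#   zpool create -o ashift=12 fastpool raidz2 disk0 disk1 disk2 disk3
```

At 4 disks the usable capacity is identical either way; the trade-off is the one described above: the striped mirror gives roughly double the IOPS and much faster resilvers, while raidz2 survives any two disk failures.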

Cripes, I was stoked that I managed to upgrade from 4x 2TB to 4x 4TB recently.

I don’t see any issues!

/me hides his 16x 4TB 12G SAS drives…

I think I’m at 7x 18TB drives. I’m slowly replacing all the smaller 8TB disks in my server. Only 5 more to go. After that, it’s a new server with more bays and/or a JBOD shelf.

iesou

That’s my next step. I have 8x 8TB drives I need to start swapping, 2x 512GB NVMes for system/app cache, and 1x 2TB NVMe for media cache.

platysalty

I’ll gladly take those problems out of your hands for free

@Vake@lemmy.world

Wondering what software you’re running to have all the storage managed and then your containers and things on top? Is it all on the 730XD?

The picture of the GUI at the end is Proxmox.
Proxmox is really powerful and great for a few servers.

@Rollio@lemmy.ml

I don’t see any problem here…

HTTP_404_NotFound (OP)

There are no free PCIe slots left! That is a huge problem!

This does seem like an issue; I can help you free up some PCIe slots if you’d like.

LoudWaterHombre

Is your problem that you are bragging about your drives?

HTTP_404_NotFound (OP)

I’m out of room to add more drives!

Every one of my servers is basically completely full on disks. I need more servers.

LoudWaterHombre

I need some drives

@joel@aussie.zone

Love this. Apart from hosting an instance, what are you using it for? Self-cloud?

HTTP_404_NotFound (OP)

I host a few handfuls of websites, some Discord bots.

I hoard Linux ISOs. I use it for general-purpose learning and experimentation.

There is also kubernetes running, source control, and a bit of everything else.

Amateur data hoarder here; teach me your ways

HTTP_404_NotFound (OP)

Backups, backups, backups.

Anything you don’t want to lose, follow the 3-2-1 rule: three copies of the data, on two different media, with one copy off-site.

Snapshots / RAID are not backups.

Also, Unraid is fantastic for handling bulk media. ZFS is fantastic for keeping things safe (and fast).

And Ceph is great for squeezing 20k IOPS out of 6 million IOPS’ worth of enterprise SSDs!
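The 3-2-1 rule mentioned above can be sketched as a tiny script: the live data is copy one, a second local medium holds copy two, and copy three goes off-site. The paths and the off-site host below are made-up examples, not the author’s setup:

```shell
#!/bin/sh
# Hedged sketch of the 3-2-1 rule: 3 copies, 2 media, 1 off-site.

backup_321() {
    src=$1         # the live data (copy #1)
    local_dst=$2   # mount point of a second local medium (copy #2)
    mkdir -p "$local_dst"
    cp -a "$src" "$local_dst/"
    # Copy #3 goes off-site, e.g. (commented out in this sketch):
    #   rsync -a "$src" offsite-host:backups/
}

# Example: backup_321 "$HOME/important" /mnt/backup-disk
```

In practice you would use a real backup tool with snapshots and verification rather than bare `cp`; the point is only that all three copies exist on independent failure domains.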
