Hi all,
I am looking at building my next NAS. My current one will move offsite, and the new one will be primary. I previously used this motherboard and was planning to go with it again. Then I saw this one, which seems like a better option: it has a slightly better CPU and a PCIe slot, but maxes out at 32GB of memory compared with 64GB on my current board.
Am I missing anything or is this a no-brainer to switch to the N100 board?
May I ask why this mobo comes with the CPU? I guess the CPU can be replaced, but I can't find that one sold independently. I have no experience with this kind of hardware, but it seems like it could be cheaper to buy the mobo and CPU from a local store. Then again, I'm used to desktop components only (kind of), so this one could be better value for the money.
@rambos @monty33 The N-series processors are for embedded applications and are soldered to the motherboard. Look up #Erying motherboards; they use more powerful laptop CPUs (also soldered) for not much more than the N series.
I struggle to find a motherboard and separate CPU that gives me better value than this. Neither of these boards is perfect, but I think they offer plenty for my use as a NAS.
Ah ok that makes sense, thx for explaining
The N100 only has 9 PCIe gen 3 lanes. The board has 4 USB ports (2 x 3.0), 4 2.5G Ethernet ports, 2 M.2 slots, and 6 SATA connections. They might be using a PCIe switch/splitter chip to connect all the components, which, depending on the type and how it's used, could have a big effect on I/O performance.
EDIT: The N5105 from your previous board only has 8 lanes. What was your max read/write speed on it, and did you see anything strange in lspci?

What is the best way to check max read/write speed? Here is the output from lspci. Do you see anything strange?
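If it helps, one concrete thing to look for in lspci output is the negotiated link speed/width on the storage controllers. A sketch, assuming the standard lspci from pciutils (LnkSta lines may only appear when run as root):

```shell
# List SATA/NVMe controllers together with their negotiated PCIe link status.
# "LnkSta" shows the actual speed/width each device trained at, which is
# where a splitter/switch bottleneck would show up.
lspci -vv 2>/dev/null | grep -iE 'sata controller|non-volatile memory|LnkSta:'
```

If a controller reports a narrower width or lower speed than its LnkCap, that lane is being shared or downgraded.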
lspci looks fine, I don't see anything strange. I usually just use dd to check write speeds.

I'm running my NAS on a 12-year-old motherboard with 16GB of RAM, the max the board supports. I wish I could bump this up now after running this system for 9 years.
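For reference, a rough sequential-speed check with dd might look like the sketch below. The test path and sizes are placeholders; conv=fdatasync makes dd flush to disk before reporting, so the rate isn't just the page cache:

```shell
# Write test: 512 MiB of zeros; fdatasync forces a flush so the reported rate is honest.
dd if=/dev/zero of=/tmp/ddtest.bin bs=1M count=512 conv=fdatasync
# Read test: read the file back (drop the page cache first if you want uncached numbers).
dd if=/tmp/ddtest.bin of=/dev/null bs=1M
rm /tmp/ddtest.bin
```

For real NAS numbers, point the test file at the pool rather than /tmp, and use a size larger than your RAM if you want to see sustained drive speed rather than cache speed.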
I would recommend having a board with at least one PCIe slot so if you ever need more drives you can plug them all into an HBA card. My board has 3 slots and I use 2 of them at the moment: one for the HBA card that supports 24 drives and another for a 10Gb NIC.
The third I would probably use to add another HBA card if I expand drive quantities.
I’m very happy with my current N5105 board that I linked, even though it doesn’t have any PCIe slots. With 6 SATA ports and two M.2 slots I should be OK. If necessary I can add an M.2-to-6-SATA adapter to get up to 12, which is definitely more than I need. If/when I get 10GbE I would have to upgrade either of these boards to get those speeds, although an M.2 PCIe riser with a 10GbE card would get me ~7Gb/s, so that is also an option.
Any thoughts on these two boards? I don’t see any real disadvantages to the N100 board when compared to the N5105.
Seems like the N100 is your option if you are only choosing between these two. Personally I am in the same boat as others here, where desktop hardware is my preference at the moment, especially if I can find combo deals for mobo/CPU.
Though my recommendation is to consider a board with a PCIe slot for a potential LSI HBA card; stay away from any other SATA expansion cards unless you don’t value your data.
If you do ever pick up an LSI HBA card with support for 8/12/24 drives, I would also suggest plugging the whole pool into this card and not mixing and matching between onboard SATA connections and the card.
A boot drive can still connect to a SATA connection on the board, as it is not part of the pool.
I wouldn’t buy hardware from Aliexpress as it has a bit of a reputation.
Anyway I would go with the N100 as it supports 3200MT/s memory.
Why do you need so much ram in your NAS?
ZFS. It can use up as much RAM as you care to give it for caching. So if you are slinging a lot of data back and forth, more RAM is better. Especially if you are using HDDs instead of SSDs.
And bumping up the RAM for caching makes a HUGE difference in performance on a RAM-starved system. Going from 16 to 32 gigs almost doubled my read/write performance for anything other than tiny files here and there, and overall I/O latency tanked.
Why do you need to cache data? To seed a lot of torrents?
It’s a function of ZFS itself. Data to be written to the drives is first staged in RAM, then flushed to the drives. One of the benefits of this is that if you are moving a file smaller than the available RAM, your transfer won’t appear to be limited to the write speed of the drives.
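If you’re curious how much RAM ZFS is actually using, OpenZFS on Linux exposes ARC statistics in procfs. A quick sketch, assuming the standard OpenZFS arcstats path:

```shell
# Print the current ARC size and its configured ceiling, converted from bytes to GiB.
# Each arcstats line is "name type value", so $1 is the stat name and $3 the byte count.
awk '/^size|^c_max/ { printf "%s: %.1f GiB\n", $1, $3 / 2^30 }' /proc/spl/kstat/zfs/arcstats
```

By default the ARC ceiling (c_max) is a large fraction of system RAM, which is why a 32GB box sees such a big caching benefit over a 16GB one.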
What software are you planning on running? The N100 is a pretty stout chip; it runs my router. Not all file systems need insane amounts of RAM like ZFS.
That said, you may want a full-sized PCIe slot depending on what hard drives you’re running. At work our RAID controllers are gimped by their PCIe interface. Even with regular-ass hard drives they will outpace PCIe 2.0 x1 speeds, since most cheap HBAs are PCIe 2 or maybe 3.
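The back-of-envelope math on that bottleneck, using rough assumed numbers (~500 MB/s usable per PCIe 2.0 lane after 8b/10b overhead, ~200 MB/s sequential per ordinary HDD):

```shell
# Back-of-envelope: a handful of plain HDDs already outruns a PCIe 2.0 x1 link.
drives=4; per_drive=200; lane=500
echo "$(( drives * per_drive )) MB/s of disk vs $(( lane )) MB/s on a PCIe 2.0 x1 link"
```

So even a modest 4-drive pool can saturate a narrow link long before the drives themselves are the limit.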
I will be running TrueNAS SCALE with ZFS, so I will be installing 32GB of memory.
I realize a full-size PCIe slot would be better, but it’s not really necessary for my application. Both boards have 6 SATA ports and two M.2 slots. I could put in an M.2-to-6-SATA adapter when I need more HDDs. I’m really just comparing these two, and I don’t really see any disadvantages to the N100 board.
On the N100, they say the max supported is 16GB of RAM. If you were looking for at least 32GB, it may not be an option anymore.
The board is listed at 32GB, but I’ve seen folks online who have installed more and tested it. The same 16GB limit is listed for the N5105 that I currently have; I have 32GB on that and tested it.
Ok, this is good to know!