Linux uses memory, just like Windows and every other operating system. Linux uses spare memory to improve performance, just like every other operating system. Your requirements depend on what you do. Here are guidelines based on real-life usage.
Let's step through different uses from the smallest to the largest.
You might use a netbook when out walking in the bush. Netbooks have limited memory, slow processors, slow disks, and small screens. Netbooks are good for checking email and sending photographs, but that small screen makes image editing, Web development, and a whole lot of other things impractical. Can you run Linux successfully, and what is the memory requirement?
I ran Ubuntu Linux for a year on a small, slow netbook. I also kept a high-powered desktop computer for image editing and Web development. The netbook had a gigabyte of memory (1,024 MB). Linux used about 150 MB at startup, then memory crept up to 300 MB with various background services loaded and a basic Web browser, in this case Firefox with only a few add-ons.
There was plenty of memory for reading email and a number of other things, but not with them all loaded at once. Linux really slowed down when memory usage went above 700 MB. If I started all the same applications I used on the desktop, memory usage would quickly approach 700 MB, then creep past 800 MB when some of the applications had files open. The result was painful. I uninstalled some applications, particularly anything using Java, and memory usage went down.
Part of the problem with installing applications and then uninstalling them was the load of junk left behind. After I uninstalled a set of big applications, close to 100 MB more memory was used than before their installation.
For a netbook I suggest installing only what you need and monitoring the memory used when you start each application. Stop things from starting automatically if you do not need them all the time. Aim for memory usage of less than half the total memory when you have all the applications started. They will quickly add another 20% to 30% when they open a few files.
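One way to monitor the cost of each application is to snapshot memory use before and after launching it. A rough sketch using `free` (the 10-second settle time is an assumption; give slow applications longer):

```shell
#!/bin/sh
# Rough measure of how much memory one application adds.
# Usage: ./memcost.sh firefox   (substitute any command you want to test)
used_mb() { free -m | awk '/^Mem:/ {print $3}'; }

before=$(used_mb)
"$@" &                  # start the application in the background
sleep 10                # give it time to load and settle
after=$(used_mb)
echo "$1 added roughly $((after - before)) MB"
```

The "used" column includes some cache, so treat the number as a rough guide rather than an exact figure.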
For my use, I would pay a little bit extra for a faster processor and a little bit extra for a faster disk, preferably one of the better SSDs but not worry about the memory usage because the models with the faster processors usually have extra memory. I would install just one thing at a time and check the usage after each installation.
Some ultrabooks have 13" screens and 2 GB of memory. Many have 4 GB of memory, and 4 GB is more than you need. I have some 4 GB ultrabooks and desktop machines running Linux with heaps of applications installed, including Apache, PHP, MySQL, GIMP, the bloated OpenOffice, and about 60 applications I installed just to see what they do. Memory usage starts at 0.4 GB and quickly climbs close to 1 GB. During heavy use of many applications, memory passes 1.5 GB. If I had a 2 GB machine, I would see Linux slowing down. 4 GB gives me so much extra memory that Linux never slows down due to memory usage. Get at least 4 GB.
The last time I built a desktop, I was going to fit it with 64 GB of memory because memory is so cheap. I ended up fitting only 16 GB because I could use a lower-powered processor and motherboard, which let me use a nearly silent fan. Quiet is a higher priority than speed.
The 16 GB machine reaches 1.7 GB of memory usage with normal use. Why have all that extra memory? Occasionally I scan all the files in a storage array, and the disk directories can use a couple of GB of memory. Your operating system works faster when all the directories can stay in memory. The first scan will be slow; subsequent scans will be faster.
Windows with NTFS is very good at that type of file system speedup. Basic Linux with the standard Ext4 file system is not as good. There may be settings in Linux to improve the situation. You can also start a search in the background to pre-read the directories before you need them. In Ubuntu you could open Places, select the file system, then start a search for a file that does not exist. You can then do whatever else you need to do while the search drags all those disk directories into memory.
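You can get the same effect without opening a search window by walking the directories from a terminal, so the kernel pulls the metadata into its cache. A minimal sketch (the mount point is an assumption; pass your own storage path):

```shell
#!/bin/sh
# Warm the directory cache: read every directory under the given
# mount point so the kernel keeps the metadata in memory.
# Default target is $HOME; pass your storage mount point instead.
target="${1:-$HOME}"
find "$target" -type d -print > /dev/null 2>&1 &   # discard output, we only want the reads
echo "cache warm-up started for $target (pid $!)"
```

Run it after boot and carry on working; by the time you need the array, most of the directory metadata is already in memory.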
Memory is cheap in a desktop computer, unless you are buying one of those brands where they try to steal all your money if you select any options, and memory uses very little power. Get the maximum memory for your motherboard.
Servers are always limited by something. Your aim is to make the server fast enough to be limited only by the network.
Linux-based servers usually have the minimum software installed and use very little memory for the operating system. The file system will work best with most of the directories in memory, and that is usually about one percent of the disk size. If your disk storage is 1 terabyte (1,000 GB), the directories will use about 10 GB of memory. You can expand out from there.
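The one-percent rule is easy to compute; a sketch of the arithmetic from the text:

```shell
# Rule of thumb from the text: directory metadata needs
# roughly one percent of the disk size.
disk_gb=1000                       # 1 TB array
dir_cache_gb=$((disk_gb / 100))    # one percent
echo "Directory cache estimate: ${dir_cache_gb} GB"
# prints: Directory cache estimate: 10 GB
```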
You will not use all the disks all the time. If you have a 5 TB array, there might be only 1 or 2 TB of active files.
Web servers might use 20 MB per page delivered to the browser when the pages are not cached in a proxy server. You might get 10,000 page reads per minute with 8,500 served from the cache, leaving 1,500 pages using 20 MB each, or 30 GB in total. You want enough memory in your server to handle the 30 GB of Web pages plus the 20 GB of disk directories plus the 0.15 GB of Linux operating system.
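Working in megabytes keeps the sums simple. A sketch of that budget (all figures are the example's assumptions, not measurements from a real server):

```shell
# Memory budget for the example Web server, in MB.
reads_per_min=10000
cached=8500            # pages served by the proxy cache
mb_per_page=20
pages_mb=$(( (reads_per_min - cached) * mb_per_page ))   # 1500 * 20 = 30000 MB
dirs_mb=20000          # disk directories
os_mb=150              # the Linux operating system
echo "Minimum memory: $(( (pages_mb + dirs_mb + os_mb) / 1000 )) GB"
# prints: Minimum memory: 50 GB
```

Swap in your own page size, cache hit rate, and traffic figures to size a real server.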
You can see where the maths is heading. Memory usage in a server depends on what is using the memory and how active the server is. For Web servers, peak activity might be anywhere from 10 times to 1,000 times the average. You need to start with far more memory than you expect, then monitor the peak usage.
When you are looking at something larger than 64 GB, you are probably looking at a machine where the network connection is close to saturation and an expansion of memory will not help. Every time you expand memory, check the network delays and look for bottlenecks caused by inadequate network speed.
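A quick way to check whether the network is the bottleneck is to sample the interface byte counters from sysfs. A sketch (the interface name `eth0` is an assumption; list yours with `ls /sys/class/net`):

```shell
#!/bin/sh
# Sample the receive byte counter twice and print the rate.
# Usage: ./netrate.sh eth0
iface="${1:-eth0}"
stats="/sys/class/net/$iface/statistics/rx_bytes"
rx1=$(cat "$stats")
sleep 5
rx2=$(cat "$stats")
echo "$iface receive rate: $(( (rx2 - rx1) / 5 / 1024 )) KB/s"
```

If the sustained rate sits near the line speed of your connection, adding memory will not help; the network is the limit.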
At the other end, look for slow disk activity and prepare to move to faster disks.