I am relatively sure there is an easy answer to this, but after searching I don't know what it is. Within Excel, I want to simulate wins and losses separately based on the average and standard deviation of each. For example: avg win: $300, std dev of wins: $200; avg loss: $200, std dev of losses: $150. First I determine whether a trade is a win or a loss, then I simulate the amount. My problem is that, using a normal distribution, I can come up with negative win amounts and positive loss amounts at times. I realize I probably shouldn't use a normal distribution (leaving aside fat tails, etc.), since the results don't truncate at $0, but I don't know what else to use. Should I use some other type of distribution, or is there a mathematical/statistical technique that can be used to truncate a normal distribution at 0? Hopefully this is clear, and thanks in advance.

Lognormal distributions are used in many circumstances precisely BECAUSE they are strictly positive. If X is a Gaussian (i.e. "normally distributed") random variable, then Y = exp(X) is a lognormally distributed random variable. Mnemonic: the log of Y is normal, so Y is said to be "lognormal".

On the other hand, what most Monte Carlo simulation people do is capture the actual win/loss values from actual trades (either real-money trades made in real life, or simulated trades from a backtest). Then they just draw from this empirical distribution, i.e. "drawing balls from an urn with replacement."

You could even use this as a way to implement your truncated-Gaussian idea: generate a huge pile of winning trades from your winning Gaussian distribution, and another huge pile of losing trades from your losing Gaussian distribution. Now throw out the negative winners and the positive losers (truncating the Gaussians). Finally, take (10,000 × winning%) of the winners and (10,000 × losing%) of the losers, throw them in the urn, and you're done.
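To make both ideas concrete, here is a minimal Python sketch (not Excel, but the logic ports directly). The lognormal parameters are matched to your stated mean/std-dev pairs via the standard moment formulas; the truncated Gaussian is done by simple rejection sampling (redraw until positive). The 40% win rate and the 10,000-trade count are assumptions for illustration, not numbers from your question.

```python
import math
import random

def lognormal_with_moments(mean, sd, rng=random):
    """Draw a lognormal variate whose distribution has the given mean and std dev.

    Uses the standard moment-matching formulas:
    sigma^2 = ln(1 + (sd/mean)^2),  mu = ln(mean) - sigma^2 / 2.
    """
    sigma2 = math.log(1.0 + (sd / mean) ** 2)
    mu = math.log(mean) - sigma2 / 2.0
    return rng.lognormvariate(mu, math.sqrt(sigma2))

def truncated_gauss(mean, sd, rng=random):
    """Draw from a Gaussian truncated at 0 by rejection: redraw until positive."""
    while True:
        x = rng.gauss(mean, sd)
        if x > 0:
            return x

def simulate_trades(n, win_prob, rng=random):
    """Simulate n trade P&Ls: first decide win/loss, then draw the amount.

    Wins use a positive-truncated Gaussian(300, 200); losses use a
    negated positive-truncated Gaussian(200, 150), per the question's numbers.
    """
    pnl = []
    for _ in range(n):
        if rng.random() < win_prob:
            pnl.append(truncated_gauss(300, 200, rng))
        else:
            pnl.append(-truncated_gauss(200, 150, rng))
    return pnl

# Example: 10,000 trades with an assumed 40% win rate.
trades = simulate_trades(10_000, 0.40)
```

In Excel, the analogous moves would be `LOGNORM.INV(RAND(), mu, sigma)` for the lognormal (with mu and sigma computed as above) or `NORM.INV(RAND(), mean, sd)` combined with redraw/IF logic for the rejection approach. Note that truncating a Gaussian at 0 shifts its mean upward relative to the untruncated parameters, so if you need the post-truncation mean to equal $300 exactly, the lognormal (or a re-fitted truncated normal) is the cleaner choice.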