But first, the sanity check with linear regression! I was actually considering this before as well, so I figure it's worth a try. It appears that, the way I've structured the problem, the closest equivalent to linear regression is multivariate regression with a General Linear Model. I'm working on that right now, though the sheer amount of memory required to create 64800 x 64800 matrices of single-precision floats is really taxing the computer, even though it has 32 GB of RAM. Right now there's something like 118 GB split between RAM and virtual memory. XD
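Just to put a number on that memory pressure, here's a quick back-of-the-envelope sketch (the 64800 figure would come from flattening a hypothetical 180 x 360 one-degree grid; that interpretation is my assumption):

```python
import numpy as np

# Hypothetical: a 180 x 360 one-degree grid flattened to 64800 features.
n = 64800

# A dense n x n single-precision matrix (e.g. X^T X in the normal
# equations) costs n * n * 4 bytes.
bytes_needed = n * n * np.dtype(np.float32).itemsize
print(f"{bytes_needed / 2**30:.1f} GiB per matrix")  # ~15.6 GiB
```

So even one such matrix is ~15.6 GiB, and a normal-equations solve typically needs a few of them alive at once (X^T X, intermediates, the solution), which would explain blowing well past 32 GB. One possible workaround is to hand the tall-skinny design matrix X straight to `np.linalg.lstsq` instead of forming X^T X at all, though whether that fits depends on how many samples there are.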

A plain vanilla NN isn't actually that different from the autoencoder implementation I'm using, other than that the autoencoder is designed to handle a large output vector, while something like a Multi-Layer Perceptron is designed with just a few classes as output nodes. In fact, I'm finding that what I've implemented is kind of a weird hybrid of an autoencoder and an MLP. Though if you mean I should try implementing a Perceptron without a hidden layer and doing Logistic Regression on the data, I suppose I could try that too.
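For what it's worth, a Perceptron with no hidden layer and sigmoid outputs trained on cross-entropy is exactly multi-output logistic regression, so that baseline is only a few lines of NumPy. A minimal sketch (the shapes and data here are made up for illustration; the real grid would be 64800 cells):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: a handful of days, a tiny grid instead of 64800 cells.
n_samples, n_in, n_out = 100, 50, 50

X = rng.random((n_samples, n_in)).astype(np.float32)
# Mostly-zero targets, mimicking "no earthquake in this cell today".
y = (rng.random((n_samples, n_out)) < 0.05).astype(np.float32)

W = np.zeros((n_in, n_out), dtype=np.float32)
b = np.zeros(n_out, dtype=np.float32)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Batch gradient descent on cross-entropy; with no hidden layer this is
# exactly multi-output logistic regression.
lr = 0.1
for _ in range(100):
    p = sigmoid(X @ W + b)             # predicted probability per cell
    grad = X.T @ (p - y) / n_samples   # gradient of mean cross-entropy w.r.t. W
    W -= lr * grad
    b -= lr * (p - y).mean(axis=0)

print(p.shape)
```

The same structure scales to the full grid by changing `n_in`/`n_out`; the only heavy object is the `n_in x n_out` weight matrix, which at 64800 x 64800 runs into the same memory wall as the GLM above, so a smaller output encoding would likely be needed either way.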

Yeah, at the very least this is good practice with Python libraries like NumPy and Theano, and with figuring out how to solve this problem within the constraints of limited memory. Though I realize a big reason I'm having memory issues is the way I designed the dataset: it has lots of zeroes wherever no earthquake happened on a given day. In theory there's probably a much more efficient way to encode this problem, but I wanted to see whether storing the location information structurally might make the task easier.
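Since the grid is almost all zeroes, one option that keeps the structural encoding but drops the memory cost is a sparse matrix. A sketch with `scipy.sparse` (the event counts and grid size here are invented purely to illustrate the ratio):

```python
import numpy as np
from scipy import sparse

# Hypothetical: one year of daily 180 x 360 grids, mostly zeros.
days, cells = 365, 64800
dense = np.zeros((days, cells), dtype=np.float32)

# Pretend roughly 20 quakes per day at random cells (illustrative only).
rng = np.random.default_rng(0)
for d in range(days):
    dense[d, rng.integers(0, cells, size=20)] = rng.random(20).astype(np.float32)

sp = sparse.csr_matrix(dense)  # stores only the nonzero entries

dense_mb = dense.nbytes / 2**20
sparse_mb = (sp.data.nbytes + sp.indices.nbytes + sp.indptr.nbytes) / 2**20
print(f"dense: {dense_mb:.0f} MiB, sparse: {sparse_mb:.2f} MiB")
```

CSR keeps row-slicing cheap (handy for pulling out one day at a time), and most of the linear-algebra ops needed for regression accept sparse inputs directly, so the structural layout doesn't have to be given up to get the memory back.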
