Can MaxScript handle extremely large text files?
Nanne
We have a piece of software at work that spits out a lot of data as raw text, but the data is in the wrong order for 3ds Max, so I need to write a MaxScript that reads each line of this text file, changes the order of the data, and saves it out as a new text file (.csv). Can this be done with MaxScript, or will it try to load the whole file into memory at once and crash the system or something like that? The text files can be several gigabytes in size (Notepad can't open them, but Notepad++ can). Will MaxScript handle it, or is it better to create such a script in Python or C++ instead?
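For what it's worth, the core pattern being asked about — read one line, reorder its fields, write it out, never holding the whole file in memory — can be sketched in Python (one of the alternatives mentioned); the column order below is purely an example, not the real CFD layout:

```python
import csv

def reorder_csv(src_path, dst_path, new_order):
    """Stream src to dst, reordering columns; new_order is a list of
    source-column indices, e.g. [0, 2, 1]. Only one row is in memory
    at a time, so the input file size doesn't matter."""
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        for row in reader:
            writer.writerow([row[i] for i in new_order])
```

Memory use stays flat regardless of input size because nothing is accumulated across rows.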

Kristoffer Helander
  :: Works ::   |  :: Resumé ::   |  :: Contact ::   |  :: CG Blog ::   |  :: Links ::     
read 469 times
5/29/2016 6:14:11 PM (last edit: 5/29/2016 6:14:11 PM)
Garp
No idea.
For this kind of work that doesn't need to be done from within the app, I'd go for C++ regardless.

5/29/2016 6:30:23 PM (last edit: 5/29/2016 6:30:23 PM)
Nanne
But such a hassle for me to figure out how to write this in C :P
I'll give it a try in MaxScript, just for fun, I need to be better at MaxScript anyway :)

If it doesn't work, I have an answer to my own question and I'll share it with you all :) But I'll probably fall back to doing it in C# as plan B, since I have more experience with .NET.

5/29/2016 9:45:45 PM (last edit: 5/29/2016 9:46:03 PM)
Garp
You could be fancy and use dotNET from mxs. Something like
data = (dotnetclass "System.IO.File").ReadAllLines(@"C:\Users\Nanne\Desktop\MyBigFile.txt")
see if it loads.

5/29/2016 11:52:58 PM (last edit: 5/29/2016 11:53:16 PM)
Nanne
Oh, thanks! :) Yeah, I'll give that a try in case MaxScript can't handle it on its own. On the other hand, .NET probably has better string manipulation methods than MaxScript, so that might be helpful for reordering the data.

5/29/2016 11:56:18 PM (last edit: 5/29/2016 11:56:18 PM)
jpedleham
I've experienced Max locking up quite severely with massive text files; sometimes even Notepad struggles. I too would do it in a .NET language. I ended up coding a little Visual Basic program to do mine, as Max would lock up for ages.
5/30/2016 11:16:42 PM (last edit: 5/30/2016 11:16:42 PM)
br0t
MAXScript has all the functionality to read files as a stream as opposed to all at once (readLine etc.). No idea how the speed compares to other languages, though; internally it will be implemented in C++, but it may still take way longer (then again, you may just let it run for an hour and be done). As long as you don't aggregate everything that you read, you should be fine with regard to memory consumption.
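The "don't aggregate" point is the whole trick, in any language. A small Python illustration of the contrast (the helper name is made up):

```python
def count_lines_streaming(path):
    """Iterating a file object yields one buffered line at a time,
    so memory stays flat no matter how big the file is."""
    n = 0
    with open(path) as f:
        for _ in f:
            n += 1
    return n

# By contrast, f.readlines() (like .NET's ReadAllLines) materialises
# every line in memory at once -- fine for megabytes, fatal for gigabytes.
```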

Reordering files several GB in size may be tricky. I'm not sure how best to approach it; maybe you read a chunk, extract things into several smaller files that e.g. contain entries of a certain category, and later write-append them into a single file?
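That split-then-append idea could be sketched like this in Python; `category_of` is a caller-supplied function (an assumption here, since the real categories depend on the data):

```python
import os

def split_by_category(src_path, out_dir, category_of):
    """Stream src_path once, appending each line to a per-category
    .part file. category_of maps a line to a category name."""
    handles = {}
    try:
        with open(src_path) as src:
            for line in src:
                cat = category_of(line)
                if cat not in handles:
                    handles[cat] = open(os.path.join(out_dir, cat + ".part"), "a")
                handles[cat].write(line)
    finally:
        for h in handles.values():
            h.close()

def concatenate(parts, dst_path):
    """Write-append the per-category files into one output, in order."""
    with open(dst_path, "w") as dst:
        for p in parts:
            with open(p) as part:
                for line in part:
                    dst.write(line)
```

Only one line plus a handful of file handles live in memory at any point, so this scales to multi-gigabyte inputs.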

I'd like to hear what you come up with!

never get low & slow & out of ideas

5/31/2016 11:47:52 AM (last edit: 5/31/2016 11:47:52 AM)
br0t
Okay, I did a quick test reading a text file of roughly one gigabyte line by line.

MAXScript: 125 seconds
Python: 15 seconds

Don't use MAXScript if you don't have to :D
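A minimal harness for reproducing this kind of timing test, assuming a plain line-by-line pass is what's being measured:

```python
import time

def time_line_pass(path):
    """Time one line-by-line pass over a file.
    Returns (line_count, seconds elapsed)."""
    start = time.perf_counter()
    n = 0
    with open(path) as f:
        for _ in f:
            n += 1
    return n, time.perf_counter() - start
```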


5/31/2016 11:54:46 AM (last edit: 5/31/2016 11:55:52 AM)
Nanne
I'm playing around with this now. Here's my MaxScript.

--script begins
iFile = openFile @"S:\cfd.csv"
oFile = createFile @"S:\mycfd.csv"

startFound = false

while not eof iFile do
(
    arr = filterString (readLine iFile) ","
    if arr.count > 0 and arr[1] == "Node" do startFound = true
    if arr.count == 0 do startFound = false

    if startFound do
        format "%\n" (arr as string) to:oFile
)

close iFile
close oFile
--script ends

And it is suuuuuuuper slow! :P This file is only 5 MB (40,662 lines), but it's taking ages to write the new file! :P It's adding about 2 KB to the file size every second. I'm refreshing Windows Explorer with F5 while Max is working on it.

It is very memory efficient, though; there is almost no memory footprint at all. But when I tried loading a 900 MB file with:
data = (dotnetclass "System.IO.File").ReadAllLines(@"C:\Users\Nanne\Desktop\MyBigFile.txt")
Max used more than 4 GB of extra RAM just to load that text file.

I'll have to look into another method. I'll try using .NET within MaxScript and see if that's faster.

6/11/2016 5:17:38 PM (last edit: 6/11/2016 5:24:37 PM)
Nanne
Here is my new code, using System.IO.StreamReader from .NET. It's nice because it only reads one line at a time and hardly takes up any memory at all. However, it still causes Max to freeze ("Not responding"), even though the script continues to run in the background and eventually finishes, so it is working.

I would like to have a progress bar in the UI to show the progress. So I first read the file just to count the number of lines (the "Scan file" button), then I start the real process (the "Process file" button) that reads line by line, rearranges the data into the correct order, and writes each new line into a new file. It also calculates a percentage for the progress bar, based on the current line number compared to the previously counted total number of lines.

I've tried it on a file with around 18,000 lines (~1 MB), and that works; the progress bar keeps updating smoothly. On larger files with around 108,000 lines (~5 MB), it works in the beginning, but eventually the Max UI freezes and the progress bar stops updating, even though the script keeps working in the background. The file I really need to work with is 900 MB and contains millions of lines, and it takes several minutes to rearrange the data, so I would really like to have a progress bar! :)

Is there a way to get this to work better? Multithreading? :)

Here is the new script!

--script begins
rollout progressTest "Progress Test"
(
    local readpath = @"D:\OneDrive\Projects\Work\CFD\SimpleFile3.txt"
    local writepath = @"D:\OneDrive\Projects\Work\CFD\SimpleFile2.txt"
    local numLines

    button calculate "Scan file"
    label fileinfo "Lines in file" style_sunkenedge:true width:180 height:16
    button doit "Process file"
    progressbar doit_prog color:blue

    on calculate pressed do
    (
        --Count lines in file
        sr = dotnetobject "System.IO.StreamReader" readpath
        numLines = 0
        while not sr.EndOfStream do
        (
            sr.ReadLine()
            numLines += 1
        )
        sr.Close()
        fileinfo.text = (numLines as string) + " lines"
    )

    on doit pressed do -- when the button is pressed...
    (
        sr = dotnetobject "System.IO.StreamReader" readpath
        sw = dotnetobject "System.IO.StreamWriter" writepath

        --Start conversion
        startFound = false
        start = timeStamp()
        sw.WriteLine("Node,Global Id,X,Y,Z,Vx Vel,Vy Vel,Vz Vel,Press,Temp,Scal1,TurbK,TurbD,HeatFlux,Shear Mag,NVol,Density,Mach")
        currLine = 0
        while not sr.EndOfStream do
        (
            arr = filterString (sr.ReadLine()) ","
            if arr.count > 0 and arr[1] == "Node Number" do
            (
                startFound = true
                continue -- skip the input header row; our own header is already written
            )
            if arr.count == 0 and startFound do startFound = false

            if startFound do
            (
                newArr = #()
                newArr[1] = arr[1]
                newArr[2] = ((arr[1] as integer) + 1) as string
                newArr[3] = arr[2]
                newArr[4] = arr[3]
                newArr[5] = arr[4]
                newArr[6] = arr[12]
                newArr[7] = arr[13]
                newArr[8] = arr[14]
                newArr[9] = arr[6]
                newArr[10] = arr[7]
                newArr[11] = 1.0 as string
                newArr[12] = arr[9]
                newArr[13] = arr[8]
                newArr[14] = arr[10]
                newArr[15] = arr[11]
                newArr[16] = arr[15]
                newArr[17] = arr[5]
                newArr[18] = 0 as string

                str = ""
                for i = 1 to newArr.count do
                (
                    str = append str newArr[i]
                    if i != newArr.count do str = append str ","
                )
                sw.WriteLine str
            )
            doit_prog.value = 100. * currLine / numLines
            currLine += 1
        )
        sr.Close()
        sw.Close()
        endTime = timeStamp()
        format "Processing took % seconds\n" ((endTime - start) / 1000.0)
    )
)
createDialog progressTest 200 120 -- create a dialog to test
--script ends

If you want to test the script, the data should look like this:

Node Number,Global Id,X,Y,Z,Vx Vel,Vy Vel,Vz Vel,Press,Temp,Scal1,TurbK,TurbD,HeatFlux,Shear Mag,NVol,Density,Mach

Make sure that is written on two lines. Then just duplicate the last line as many times as you like, depending on how large a test file you want. The "Node Number" field is important, because that's what tells the script to start gathering and rearranging the data.
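To build test data like that programmatically, a small generator helps (a Python sketch; the data row you pass in is whatever line you want duplicated):

```python
def make_test_file(path, data_row, copies):
    """Write the header described above, then `copies` duplicates of
    one data row, so the test file can be scaled to any size."""
    header = ("Node Number,Global Id,X,Y,Z,Vx Vel,Vy Vel,Vz Vel,"
              "Press,Temp,Scal1,TurbK,TurbD,HeatFlux,Shear Mag,NVol,Density,Mach")
    with open(path, "w") as f:
        f.write(header + "\n")
        for _ in range(copies):
            f.write(data_row + "\n")
```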

6/12/2016 6:47:34 PM (last edit: 6/12/2016 6:52:00 PM)
Dub.
Add this line every 1000 lines or something:


That will mean that Max's UI keeps updating.
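The throttling idea here is language-independent: touch the UI only every `step` iterations, because a per-line UI update can dominate the runtime. A Python sketch of the same pattern (the callback names are made up):

```python
def process_with_progress(lines, handle_line, report, step=1000):
    """Process every line, but call the `report` callback (the stand-in
    for a UI update) only once per `step` lines. Returns the line count."""
    count = 0
    for count, line in enumerate(lines, 1):
        handle_line(line)
        if count % step == 0:
            report(count)
    return count
```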

6/14/2016 1:45:09 AM (last edit: 6/14/2016 1:45:09 AM)
Nanne
Thanks, Dub! That was a neat trick; it works well :)

6/15/2016 4:59:56 PM (last edit: 6/15/2016 4:59:56 PM)