@tha_rami
It's not too hard to parse the file the way you have it, but you can save yourself a few headaches and a lot of coding if you save your file from Excel a different way. Like pcRaider said, human error is where problems can occur - but as long as your file is consistent, you should be OK.
When you enter your data in Excel, you are probably doing it like this:
Infantry|A single soldier|8|S| etc.
across a row, with each piece of info in its own column, then the next line, and so on. That's fine. What we want to do is get this into a format that DBC can read easily, and you want to get it into an array. Well, you could write a parser function, but you don't have to as long as your data is consistent. By consistent I mean there is always the same number of columns in each row. Your example has 18 items per row. This can be as many items as you want; you just want to make sure that each row has the same number (that's because we are going to convert it to an array).
Now, once you have your table of data in your Excel spreadsheet, you want to put that in a format DBC can read as an array. Easy enough. Open another spreadsheet and, starting from the leftmost column of your data table, copy each column into a single column on the new spreadsheet. So if your data looked like:
mech man 5 3
boat sail 4 0
the new data would look like:
mech
boat
man
sail
5
4
3
0
Now save this new data as DOS text - not CSV or tab delimited.
In DBC you have to dimension an array that is the size of your data table. Remember, the indexes of an array are one less than the number of columns and rows of your table.
In your example, there are 3 rows and 18 columns. So set up a string array in DBC that reads:
Dim array$(2,17)
Now all you have to do is load your file into the array:
Load array "ramis_file.txt",array$(2) (only use the first dimension)
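If it helps, here's roughly what that end of things looks like in one piece. This is just a sketch - I'm assuming the 3 row by 18 column table from your example and the file name from above, so swap in your own size and name. The nested loops at the end are only there to print everything back out so you can check it loaded:
rem sketch only - assumes a 3 row x 18 column table saved as DOS text
rem "ramis_file.txt" is a placeholder name - use your own file
dim array$(2,17)
load array "ramis_file.txt", array$(2)

rem print the contents back out as a quick check
for r = 0 to 2
   for c = 0 to 17
      print array$(r, c)
   next c
next r
wait key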
That's it! I know I wrote a lot, but the whole process takes no time at all. You can even write a macro in Excel that does all the copying into the single column for you. The key is that the array is the proper size relative to your data table.
If you still want to parse out your file, dimension your string array to the size of your data. Create counting variables for each dimension (if your array was array$(3,9), you'd have two counting variables, index1 and index2).
Start reading your file one byte at a time. If any byte value comes through that is not equal to 59 (';'), 13 (carriage return) or 10 (line feed), add that byte to a temporary string variable:
if byte <> 59 and byte <> 13 and byte <> 10
   temp$ = temp$ + chr$(byte)
endif
If the byte = 59, set that array element equal to temp$, increment index2 and set temp$="":
array$(index1,index2)=temp$
inc index2
temp$=""
If the byte = 10, it's the end of a line, so set that array element equal to temp$, increment index1 and set index2=0. Also set temp$="".
Repeat this process until the end of the file and you should have an array filled with the info.
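Putting those steps together, here's a rough sketch of what the whole loop might look like. The file name "units.txt" and the 3 x 18 size are just placeholders for whatever your real file and table are, and I've used b for the byte variable:
rem sketch only - byte-by-byte parser for a semicolon separated text file
rem "units.txt" is a made-up name and 3 x 18 is your example size - adjust both
dim array$(2,17)
index1 = 0
index2 = 0
temp$ = ""

open to read 1, "units.txt"
while file end(1) = 0
   read byte 1, b
   if b <> 59 and b <> 13 and b <> 10
      rem normal character - add it to the temporary string
      temp$ = temp$ + chr$(b)
   endif
   if b = 59
      rem semicolon - store the value and move to the next column
      array$(index1, index2) = temp$
      inc index2
      temp$ = ""
   endif
   if b = 10
      rem line feed - store the last value on the line and start a new row
      array$(index1, index2) = temp$
      inc index1
      index2 = 0
      temp$ = ""
   endif
endwhile
close file 1
One thing to watch: if the last line of the file doesn't end with a line feed, the final value will still be sitting in temp$ when the loop finishes, so you may want to store it into the array one more time after the loop.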
Enjoy your day.