java.io.StreamTokenizer.ordinaryChar() Method Example
The java.io.StreamTokenizer.ordinaryChar(int ch) method specifies that the character argument is "ordinary" in this tokenizer. It removes any special significance the character has as a comment character, word component, string delimiter, white space, or number character. When such a character is encountered by the parser, the parser treats it as a single-character token and sets the ttype field to the character value. Making a line terminator character "ordinary" may interfere with the ability of a StreamTokenizer to count lines; the lineno method may then no longer reflect the presence of such terminator characters in its line count.
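To see this effect in isolation, here is a minimal sketch (the class name OrdinaryCharSketch and the sample input string are illustrative assumptions, not part of the example further below). By default, a StreamTokenizer treats '/' as a comment character; after ordinaryChar('/') it is returned as a single-character token instead:

import java.io.IOException;
import java.io.StreamTokenizer;
import java.io.StringReader;

public class OrdinaryCharSketch {

   // read every token and print one line per token
   private static void dump(StreamTokenizer st) throws IOException {
      while (st.nextToken() != StreamTokenizer.TT_EOF) {
         if (st.ttype == StreamTokenizer.TT_WORD) {
            System.out.println("Word: " + st.sval);
         } else if (st.ttype == StreamTokenizer.TT_NUMBER) {
            System.out.println("Number: " + st.nval);
         } else {
            System.out.println("Char: " + (char) st.ttype);
         }
      }
   }

   public static void main(String[] args) throws IOException {
      String text = "a / b";

      // default table: '/' starts a comment, so "b" is skipped
      System.out.println("-- default --");
      dump(new StreamTokenizer(new StringReader(text)));

      // '/' made ordinary: it loses its comment meaning and comes back
      // as a single-character token whose ttype is the character value
      StreamTokenizer st = new StreamTokenizer(new StringReader(text));
      st.ordinaryChar('/');
      System.out.println("-- after ordinaryChar('/') --");
      dump(st);
   }
}

Under these assumptions, the first pass prints only "Word: a", while the second pass prints "Word: a", "Char: /" and "Word: b", because '/' no longer starts a comment.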
Declaration
Following is the declaration for the java.io.StreamTokenizer.ordinaryChar() method:
public void ordinaryChar(int ch)
Parameters
ch -- the character to be treated as "ordinary"
Return Value
This method does not return a value.
Exceptions
NA
Example
The following example demonstrates the usage of the java.io.StreamTokenizer.ordinaryChar() method.
package com.yiibai;

import java.io.*;

public class StreamTokenizerDemo {
   public static void main(String[] args) {
      String text = "Hello. This is a text that will be split "
            + "into tokens. 1+1=2";

      try {
         // create a new file with an ObjectOutputStream
         FileOutputStream out = new FileOutputStream("test.txt");
         ObjectOutputStream oout = new ObjectOutputStream(out);

         // write something in the file
         oout.writeUTF(text);
         oout.flush();

         // create an ObjectInputStream for the file we created before
         ObjectInputStream ois = new ObjectInputStream(new FileInputStream("test.txt"));

         // create a new tokenizer
         Reader r = new BufferedReader(new InputStreamReader(ois));
         StreamTokenizer st = new StreamTokenizer(r);

         // set as an ordinary char
         st.ordinaryChar(' ');

         // print the stream tokens
         boolean eof = false;
         do {
            int token = st.nextToken();

            switch (token) {
               case StreamTokenizer.TT_EOF:
                  System.out.println("End of File encountered.");
                  eof = true;
                  break;
               case StreamTokenizer.TT_EOL:
                  System.out.println("End of Line encountered.");
                  break;
               case StreamTokenizer.TT_WORD:
                  System.out.println("Word: " + st.sval);
                  break;
               case StreamTokenizer.TT_NUMBER:
                  System.out.println("Number: " + st.nval);
                  break;
               default:
                  System.out.println((char) token + " encountered.");
                  if (token == '!') {
                     eof = true;
                  }
            }
         } while (!eof);
      } catch (Exception ex) {
         ex.printStackTrace();
      }
   }
}
Let us compile and run the above program; this will produce the following result:
Word: Hello.
Word: This
Word: is
Word: a
Word: text
End of Line encountered.
Word: that
Word: will
Word: be
Word: split
Word: into
Word: tokens.
Number: 1.0
+ encountered.
Number: 1.0
= encountered.
Number: 2.0
End of File encountered.