HBase HLog Write Timeouts Causing a RegionServer to Exit

When the dfs.socket.timeout value configured in Hadoop's hdfs-site.xml is larger than the value HBase is running with, HBase reports errors like the following while writing its HLog:

Solution: make sure the dfs.socket.timeout configured in Hadoop's hdfs-site.xml matches the value HBase uses.
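As a sketch, the property should carry the same value in Hadoop's hdfs-site.xml and in the configuration HBase reads (e.g. a copy of hdfs-site.xml in $HBASE_HOME/conf); the 60000 ms below is only an illustrative value, use whatever your cluster actually sets:

```xml
<!-- Same entry in Hadoop's hdfs-site.xml and in HBase's conf directory.
     60000 ms is illustrative, not a recommendation. -->
<property>
  <name>dfs.socket.timeout</name>
  <value>60000</value>
</property>
```

A common way to keep the two in sync is to symlink or copy the cluster's hdfs-site.xml into HBase's conf directory so the RegionServer's embedded DFSClient always sees the same settings as the DataNodes.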

The RegionServer at 10.9.141.165 logged the following:

2013-04-15 01:05:49,476 WARN org.apache.hadoop.hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block blk_5280454841001477955_73253980java.net.SocketTimeoutException: 69000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.9.141.165:23420 remote=/10.9.141.165:50010]

    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readLong(DataInputStream.java:399)
    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(DFSClient.java:2514)

2013-04-15 01:05:49,476 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73253980 bad datanode[0] 10.9.141.165:50010
2013-04-15 01:05:49,476 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73253980 in pipeline 10.9.141.165:50010, 10.9.141.152:50010, 10.9.141.158:50010: bad datanode 10.9.141.165:50010
2013-04-15 01:06:55,633 WARN org.apache.hadoop.hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block blk_5280454841001477955_73262690java.net.SocketTimeoutException: 66000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.9.141.165:41078 remote=/10.9.141.152:50010]

    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readLong(DataInputStream.java:399)
    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(DFSClient.java:2514)

2013-04-15 01:06:55,634 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73262690 bad datanode[0] 10.9.141.152:50010
2013-04-15 01:06:55,634 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73262690 in pipeline 10.9.141.152:50010, 10.9.141.158:50010: bad datanode 10.9.141.152:50010
2013-04-15 01:07:58,716 WARN org.apache.hadoop.hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block blk_5280454841001477955_73262880java.net.SocketTimeoutException: 63000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.9.141.165:48547 remote=/10.9.141.158:50010]

    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readLong(DataInputStream.java:399)
    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$ResponseProcessor.run(DFSClient.java:2514)

2013-04-15 01:07:58,718 WARN org.apache.hadoop.hdfs.DFSClient: Error Recovery for block blk_5280454841001477955_73262880 bad datanode[0] 10.9.141.158:50010
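The shrinking timeouts in these messages (69000 → 66000 → 63000 ms) are consistent with how the 1.x-era DFSClient computes its pipeline read timeout: the configured dfs.socket.timeout plus a 3-second extension per DataNode still in the pipeline, so each recovery that drops a bad DataNode shortens the timeout by 3 s. A quick sketch, assuming the 60 s default:

```python
# Effective DFSClient pipeline-read timeout (Hadoop 1.x behavior):
#   dfs.socket.timeout + 3000 ms per datanode remaining in the pipeline.
# 60000 ms is the assumed default value of dfs.socket.timeout.
DFS_SOCKET_TIMEOUT_MS = 60_000
READ_TIMEOUT_EXTENSION_MS = 3_000

def effective_timeout(num_datanodes: int) -> int:
    """Timeout the client waits for a pipeline ack, in milliseconds."""
    return DFS_SOCKET_TIMEOUT_MS + READ_TIMEOUT_EXTENSION_MS * num_datanodes

# Pipeline shrinks 3 -> 2 -> 1 datanodes across the recoveries above:
print([effective_timeout(n) for n in (3, 2, 1)])  # [69000, 66000, 63000]
```

These computed values match the 69000/66000/63000 ms figures in the log excerpt above, which is why each successive SocketTimeoutException fires 3 s sooner.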

The three DataNodes in the pipeline logged the following:

10.9.141.152

2013-04-15 01:00:07,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73253980 src: /10.9.141.165:39523 dest: /10.9.141.152:50010
2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_5280454841001477955_73253980 java.io.EOFException: while trying to read 65557 bytes
2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73253980 1 Exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/10.9.141.152:59490 remote=/10.9.141.158:50010]. 110927 millis timeout left.

    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readLong(DataInputStream.java:399)
    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:868)
    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73253980 1 : Thread is interrupted.
2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 1 for block blk_5280454841001477955_73253980 terminating
2013-04-15 01:05:49,473 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73253980 received exception java.io.EOFException: while trying to read 65557 bytes
2013-04-15 01:05:49,474 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.152:50010, storageID=DS736845143, infoPort=50075, ipcPort=50020):DataXceiver
java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:05:49,479 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Client calls recoverBlock(block=blk_5280454841001477955_73253980, targets=[10.9.141.152:50010, 10.9.141.158:50010])
2013-04-15 01:05:49,556 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: oldblock=blk_5280454841001477955_73253980(length=3121152), newblock=blk_5280454841001477955_73262690(length=3121152), datanode=10.9.141.152:50010
2013-04-15 01:05:49,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73262690 src: /10.9.141.165:41078 dest: /10.9.141.152:50010
2013-04-15 01:05:49,561 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Reopen already-open Block for append blk_5280454841001477955_73262690
2013-04-15 01:06:55,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73262690 1 Exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/10.9.141.152:60943 remote=/10.9.141.158:50010]. 113932 millis timeout left.

    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readLong(DataInputStream.java:399)
    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:868)
    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:06:55,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73262690 1 : Thread is interrupted.
2013-04-15 01:06:55,630 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 1 for block blk_5280454841001477955_73262690 terminating
2013-04-15 01:06:55,631 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73262690 received exception java.io.EOFException: while trying to read 65557 bytes
2013-04-15 01:06:55,632 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.152:50010, storageID=DS736845143, infoPort=50075, ipcPort=50020):DataXceiver
java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
    at java.lang.Thread.run(Thread.java:662)

10.9.141.158

2013-04-15 01:00:07,420 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73253980 src: /10.9.141.152:59490 dest: /10.9.141.158:50010
2013-04-15 01:05:49,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_5280454841001477955_73253980 java.io.EOFException: while trying to read 65557 bytes
2013-04-15 01:05:49,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73253980 Interrupted.
2013-04-15 01:05:49,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73253980 terminating
2013-04-15 01:05:49,495 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73253980 received exception java.io.EOFException: while trying to read 65557 bytes
2013-04-15 01:05:49,495 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.158:50010, storageID=DS2062116090, infoPort=50075, ipcPort=50020):DataXceiver
java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:05:49,578 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: oldblock=blk_5280454841001477955_73253980(length=3121152), newblock=blk_5280454841001477955_73262690(length=3121152), datanode=10.9.141.158:50010
2013-04-15 01:05:49,581 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73262690 src: /10.9.141.152:60943 dest: /10.9.141.158:50010
2013-04-15 01:05:49,582 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Reopen already-open Block for append blk_5280454841001477955_73262690
2013-04-15 01:06:55,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73262690 Interrupted.
2013-04-15 01:06:55,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73262690 terminating
2013-04-15 01:06:55,652 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73262690 received exception java.io.EOFException: while trying to read 65557 bytes
2013-04-15 01:06:55,652 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.158:50010, storageID=DS2062116090, infoPort=50075, ipcPort=50020):DataXceiver
java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:06:55,655 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Client calls recoverBlock(block=blk_5280454841001477955_73262690, targets=[10.9.141.158:50010])
2013-04-15 01:06:55,666 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: oldblock=blk_5280454841001477955_73262690(length=3121152), newblock=blk_5280454841001477955_73262880(length=3121152), datanode=10.9.141.158:50010
2013-04-15 01:06:55,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73262880 src: /10.9.141.165:48547 dest: /10.9.141.158:50010
2013-04-15 01:06:55,669 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Reopen already-open Block for append blk_5280454841001477955_73262880
2013-04-15 01:07:58,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73262880 Interrupted.
2013-04-15 01:07:58,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 0 for block blk_5280454841001477955_73262880 terminating
2013-04-15 01:07:58,735 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73262880 received exception java.io.EOFException: while trying to read 65557 bytes
2013-04-15 01:07:58,735 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.158:50010, storageID=DS2062116090, infoPort=50075, ipcPort=50020):DataXceiver
java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
    at java.lang.Thread.run(Thread.java:662)

10.9.141.165

2013-04-15 01:00:07,407 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_5280454841001477955_73253980 src: /10.9.141.165:23420 dest: /10.9.141.165:50010
2013-04-15 01:05:49,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock for block blk_5280454841001477955_73253980 java.io.EOFException: while trying to read 65557 bytes
2013-04-15 01:05:49,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73253980 2 Exception java.io.InterruptedIOException: Interruped while waiting for IO on channel java.nio.channels.SocketChannel[connected local=/10.9.141.165:39523 remote=/10.9.141.152:50010]. 290930 millis timeout left.

    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:349)
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
    at java.io.DataInputStream.readFully(DataInputStream.java:178)
    at java.io.DataInputStream.readLong(DataInputStream.java:399)
    at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PipelineAck.readFields(DataTransferProtocol.java:122)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver$PacketResponder.run(BlockReceiver.java:868)
    at java.lang.Thread.run(Thread.java:662)

2013-04-15 01:05:49,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder blk_5280454841001477955_73253980 2 : Thread is interrupted.
2013-04-15 01:05:49,476 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder 2 for block blk_5280454841001477955_73253980 terminating
2013-04-15 01:05:49,477 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_5280454841001477955_73253980 received exception java.io.EOFException: while trying to read 65557 bytes
2013-04-15 01:05:49,478 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.9.141.165:50010, storageID=DS-1327849832, infoPort=50075, ipcPort=50020):DataXceiver
java.io.EOFException: while trying to read 65557 bytes

    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:377)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
    at java.lang.Thread.run(Thread.java:662)
Original article: https://yq.aliyun.com/articles/110567
