ghdfs

GroupId                   com.globo.bigdata
ArtifactId                ghdfs_2.11
Last Version              0.0.13
Type                      jar
Description               ghdfs
Project URL               https://github.com/globocom/ghdfs
Project Organization      Globo.com
Source Code Management    https://github.com/globocom/ghdfs

How to add to project

Maven:

    <!-- https://jarcasting.com/artifacts/com.globo.bigdata/ghdfs_2.11/ -->
    <dependency>
        <groupId>com.globo.bigdata</groupId>
        <artifactId>ghdfs_2.11</artifactId>
        <version>0.0.13</version>
    </dependency>

Gradle (Groovy DSL):

    // https://jarcasting.com/artifacts/com.globo.bigdata/ghdfs_2.11/
    implementation 'com.globo.bigdata:ghdfs_2.11:0.0.13'

Gradle (Kotlin DSL):

    // https://jarcasting.com/artifacts/com.globo.bigdata/ghdfs_2.11/
    implementation("com.globo.bigdata:ghdfs_2.11:0.0.13")

Buildr:

    'com.globo.bigdata:ghdfs_2.11:jar:0.0.13'

Ivy:

    <dependency org="com.globo.bigdata" name="ghdfs_2.11" rev="0.0.13">
      <artifact name="ghdfs_2.11" type="jar" />
    </dependency>

Groovy Grape:

    @Grapes(
      @Grab(group='com.globo.bigdata', module='ghdfs_2.11', version='0.0.13')
    )

SBT:

    libraryDependencies += "com.globo.bigdata" % "ghdfs_2.11" % "0.0.13"

Leiningen:

    [com.globo.bigdata/ghdfs_2.11 "0.0.13"]

Dependencies

compile (3)

Group : Artifact                                         Type    Version
org.scala-lang : scala-library                           jar     2.11.11
com.sksamuel.scapegoat : scalac-scapegoat-plugin_2.11    jar     1.3.2
org.apache.hadoop : hadoop-client                        jar     [2.7.3,2.9.9]

test (2)

Group : Artifact                                         Type    Version
org.scalatest : scalatest_2.11                           jar     3.0.1
org.scalamock : scalamock-scalatest-support_2.11         jar     3.6.0
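
Note that hadoop-client is declared with the open version range [2.7.3,2.9.9], so dependency resolution may pick any release in that window. A build that needs reproducibility can pin the choice from the consuming project; a minimal sbt sketch, where 2.8.5 is an illustrative pick inside the range:

    // build.sbt -- pin hadoop-client within the declared [2.7.3,2.9.9] range
    dependencyOverrides += "org.apache.hadoop" % "hadoop-client" % "2.8.5" // illustrative version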

Project Modules

There are no modules declared in this project.

GHDFS

GHDFS wraps common HDFS operations behind a Scala-friendly API.

Installation

The package lives under com.globo.bigdata.ghdfs.

  • Include it in your dependencies (a minimal build.sbt sketch follows):
    "com.globo.bigdata" %% "ghdfs" % "0.0.13"

Usage

    import scala.util.Properties
    import org.apache.hadoop.fs.Path
    import com.globo.bigdata.ghdfs.HdfsManager

    // Builds the manager from the Hadoop configuration directory, when set
    val hdfs = HdfsManager(Properties.envOrNone("HADOOP_CONF_DIR"))

    val source = new Path("/tmp/example.txt")
    val target = new Path("/tmp/example-moved.txt")
    val data = new java.io.ByteArrayInputStream("hello".getBytes("UTF-8"))

    hdfs.write(source)                      // open the path for writing
    hdfs.write(source, data)                // copy an InputStream into the path
    hdfs.read(source)                       // open the path for reading
    hdfs.status(source)                     // fetch the path's status
    hdfs.move(source, target)               // move/rename within HDFS
    hdfs.listFiles(new Path("/tmp"), recursive = false).foreach(println)
    hdfs.delete(target, recursive = true)   // delete, descending into directories

    // etc.

Get Filesystem Instance

getFS exposes the underlying Hadoop FileSystem, so the full org.apache.hadoop.fs.FileSystem API stays available:

    hdfs.getFS.exists(hadoopPath)
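
A minimal sketch of dropping down to the raw FileSystem for calls GHDFS does not wrap; the path and the getFileStatus call are illustrative, not part of GHDFS:

    import scala.util.Properties
    import org.apache.hadoop.fs.Path
    import com.globo.bigdata.ghdfs.HdfsManager

    val hdfs = HdfsManager(Properties.envOrNone("HADOOP_CONF_DIR"))
    val hadoopPath = new Path("/user/example/data.csv") // illustrative path

    val fs = hdfs.getFS                     // underlying org.apache.hadoop.fs.FileSystem
    if (fs.exists(hadoopPath)) {
      // Any stock FileSystem call works here, e.g. the file length in bytes
      println(fs.getFileStatus(hadoopPath).getLen)
    }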

Contribute

For development and contributing, please follow the Contributing Guide and ALWAYS respect the Code of Conduct.

Versions

0.0.13
0.0.12