  * [[how_do_i#using_the_zeppelin_web_notebook|Zeppelin Notebooks]] are now fully configured for Python, PySpark, Spark, Hive, and shell programming. A notebook called **Basic Tests (Python, PySpark, sh, and Hive)** is available for learning more about Zeppelin (clone first).
  * [[how_do_i#transfer_files_to_from_the_cluster|Transferring Files from the Cloud]] has been added. The [[using_rclone|rclone]] package has been installed on all workstations (rclone is a command-line tool).
  * [[how_do_i#use_tensorflow|Python Tensorflow]] (CPU and GPU) and Keras are installed.
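Since Python TensorFlow (CPU and GPU) and Keras are now installed, a short Python check confirms which devices a given workstation's TensorFlow build can see. This is a hedged sketch, not part of the site's own documentation; `tf.config.list_physical_devices` is the standard TensorFlow 2.x call:

```python
def gpu_report():
    """Return a one-line summary of the local TensorFlow install (sketch)."""
    try:
        import tensorflow as tf  # assumes the cluster's TensorFlow install
    except ImportError:
        return "TensorFlow not installed in this environment"
    # List the GPUs this TensorFlow build can actually see.
    gpus = tf.config.list_physical_devices("GPU")
    return f"TensorFlow {tf.__version__}: {len(gpus)} GPU(s) visible"

print(gpu_report())
```

On a GPU node this should report a nonzero GPU count; on a CPU-only workstation it will report zero.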
 **Watch this space for updates.**
  
 ====About The System====
  
This computation resource is a collection of nine individual workstations that can work together as a scalable data science cluster for Big Data processing. The system can run large Hadoop and Spark jobs using the 10 TByte Hadoop Distributed File System (HDFS) on up to 120 cores. There are also three GPU-equipped nodes that are configured to run TensorFlow. Total system memory is 600 GBytes spread across 30 separate motherboards.

Each workstation provides a Linux desktop environment that supports Anaconda Navigator (Python), RStudio, and the Zeppelin web notebook (Spark, PySpark, Hadoop Hive, HBase, Python).
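As a rough illustration of a first notebook paragraph on such a system, the classic word-count exercise can be run in plain Python before scaling it out with PySpark over HDFS. This is a generic sketch, not taken from the cluster's notebooks:

```python
from collections import Counter

# Classic first "Big Data" exercise, here in plain Python; on the cluster the
# same logic would be distributed with PySpark (e.g. an RDD flatMap/countByValue).
text = "to be or not to be that is the question"
counts = Counter(text.split())
print(counts.most_common(2))  # [('to', 2), ('be', 2)]
```

The same word-count pattern is the usual starting point for Hadoop and Spark tutorials, which is why it makes a convenient smoke test in any of the installed environments.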
  
 ====FOR HELP CLICK ON THE "How Do I" LINK BELOW====
  
 **System News:**
   Feb-18-2022  Python Tensorflow (CPU and GPU) and Keras installed
   Feb-14-2022  Zeppelin Notebooks are configured and rclone installed
   Feb-07-2022  Anaconda Navigator
start.1644950135.txt.gz · Last modified: 2022/02/15 18:35 by deadline
