# Edge Vision Framework (EVF)

A web application framework for managing and optimizing deep learning models on edge devices, with GPU support and real-time monitoring.

## Key Features

- 🚀 **Project Management**: Organize models, datasets, and experiments in isolated workspaces
- 🎯 **Model Training**: Train models with automatic GPU resource allocation
- 📊 **Real-time Monitoring**: Track training progress and system resource usage
- 🛠️ **Model Optimization**: Optimize models for edge deployment
- 📈 **Interactive Dashboard**: Monitor system resources and training metrics
- 👥 **Multi-user Support**: Secure authentication and workspace isolation

## Installation

### Prerequisites

- Python 3.8+
- CUDA-compatible GPU
- Node.js 14+ (for frontend development)
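Before installing, you can sanity-check these prerequisites from a terminal. This is a minimal sketch using standard commands (`nvidia-smi` ships with the NVIDIA driver); it only reports what it finds and does not modify anything:

```shell
# Verify Python version (3.8+ required)
python3 -c 'import sys; assert sys.version_info >= (3, 8), "Python 3.8+ required"; print("Python OK:", sys.version.split()[0])'

# Check for the NVIDIA driver tools (needed for CUDA GPU support)
command -v nvidia-smi >/dev/null && echo "nvidia-smi found" || echo "WARNING: nvidia-smi not found; GPU support unavailable"

# Check for Node.js (only needed for frontend development)
command -v node >/dev/null && echo "Node.js found: $(node --version)" || echo "Node.js not found"
```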

### Setup

1. Clone the repository:

   ```shell
   git clone https://github.com/yourusername/nn-comp-evf.git
   cd nn-comp-evf
   ```

2. Create and activate a virtual environment:

   ```shell
   python -m venv venv
   source venv/bin/activate  # On Windows, use: venv\Scripts\activate
   ```

3. Install dependencies:

   ```shell
   pip install -r requirements.txt
   ```

4. Initialize the application:

   ```shell
   bash init.sh
   ```

5. Start the server:

   ```shell
   python app.py
   ```

The application will be available at http://localhost:5001
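Once the server is running, you can confirm it is reachable with a short Python check. This is a minimal sketch, assuming only that the app answers plain HTTP requests on the URL above; the function name `server_is_up` is illustrative, not part of EVF:

```python
from urllib.request import urlopen
from urllib.error import URLError


def server_is_up(url="http://localhost:5001", timeout=2):
    """Return True if a web server responds at the given URL."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            # Any non-5xx response means the app is up and serving.
            return resp.status < 500
    except (URLError, OSError):
        # Connection refused / timed out: server not reachable.
        return False


if __name__ == "__main__":
    print("EVF server reachable:", server_is_up())
```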

## Basic Usage

  1. Create a new project from the dashboard
  2. Upload or configure your dataset
  3. Define your model architecture
  4. Start a training run with GPU allocation
  5. Monitor training progress in real-time

## License

Licensed under the Apache License 2.0. See the LICENSE file for details.

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## Acknowledgement

This work was supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2021-0-00907, Development of Adaptive and Lightweight Edge-Collaborative Analysis Technology for Enabling Proactively Immediate Response and Rapid Learning).