3. Install FedLearn

With your virtual environment activated, use pip to install Cifer:

bash
pip install cifer

This command will download and install Cifer along with all its dependencies, including the appropriate versions of TensorFlow and PyTorch for your system.

Verify Installation

After the installation is complete, verify that Cifer has been installed correctly:

python
import cifer
print(cifer.__version__)

This should print the version number of the installed Cifer package.

Expected output:

1.0.0
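
If the import fails with a ModuleNotFoundError instead, the interpreter you are running is most likely not the one from your virtual environment. Printing the interpreter path is a quick way to confirm:

python
import sys

# This path should point inside your activated virtual environment
print(sys.executable)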

Check FedLearn Functionality

To ensure that the FedLearn module is working correctly, you can run a quick test:

python
from cifer import fedlearn
fedlearn.check_installation()

This will perform a series of checks to ensure all components are installed and configured correctly.

GPU/TPU Support Verification

If you're planning to use GPU or TPU acceleration, verify that Cifer can detect and use these resources:

python
from cifer import fedlearn
fedlearn.check_gpu_support()  # For GPU
fedlearn.check_tpu_support()  # For TPU

These commands will provide information about the available GPU or TPU resources and confirm if Cifer can utilize them.

Troubleshooting

If you encounter any issues during installation:

  1. Ensure your virtual environment is activated.

  2. Check that you have the latest version of pip (see the example commands after this list).

  3. Verify that your system meets all the requirements listed in the previous sections.

  4. If problems persist, consult the Cifer documentation or reach out to the support team.
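
If the environment checks out but installation still fails, upgrading pip and then reinstalling Cifer often resolves the problem. The commands below are a minimal example and assume your virtual environment is already activated:

bash
# Upgrade pip inside the active virtual environment
python -m pip install --upgrade pip

# Reinstall or upgrade Cifer with the updated pip
pip install --upgrade cifer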

With Cifer successfully installed, you're now ready to move on to the configuration stage and begin setting up your federated learning environment.

Install and Import Dependencies

While the Cifer package includes many core functionalities, some additional libraries need to be installed separately. Follow these steps to ensure all necessary dependencies are installed and imported correctly.

Install Additional Dependencies

Run the following command in your terminal (or prefix it with ! in a Jupyter notebook cell) to install the required dependencies:

bash
pip install tensorflow torch torchvision transformers scikit-learn tqdm pandas opencv-python Pillow pycryptodome matplotlib grpcio

Import Dependencies

With the Cifer package and the additional libraries above installed, let's import these dependencies and check that everything is set up correctly.

Import Core Dependencies

Run the following Python script to import the core dependencies and check their availability:

python
# Import Cifer and core dependencies
import cifer
import torch
import torchvision
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import grpc
import transformers
import sklearn
import pandas as pd
import cv2  # OpenCV
from PIL import Image  # Pillow
from Crypto.Cipher import AES  # PyCryptodome
from tqdm import tqdm

print(f"Cifer version: {cifer.__version__}")
print("All core dependencies successfully imported.")

# Check for GPU availability
if torch.cuda.is_available():
    device = torch.device("cuda")
    print(f"PyTorch is using GPU: {device}")
else:
    device = torch.device("cpu")
    print("PyTorch is using CPU")

# Check TensorFlow GPU availability
if tf.test.is_built_with_cuda():
    print("TensorFlow is built with CUDA support")
    print(f"TensorFlow GPU available: {tf.test.is_gpu_available()}")
else:
    print("TensorFlow is not built with CUDA support")

Save the script above as import_dependencies.py and run it to check whether GPU/TPU acceleration is available in your environment:

bash
python3 import_dependencies.py

Expected output:

Cifer version: 1.0.0
All core dependencies successfully imported.
PyTorch is using CPU
TensorFlow is not built with CUDA support

Check GPU/TPU Support (Optional)

If you plan to use GPUs or TPUs, you can check their availability:

For NVIDIA GPUs:

Run the following Python script to check CUDA availability:

python
import subprocess

# Check for an NVIDIA GPU and driver using nvidia-smi
try:
    result = subprocess.run(['nvidia-smi'], check=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    print("CUDA is available. nvidia-smi output:")
    print(result.stdout.decode('utf-8'))
except FileNotFoundError:
    print("nvidia-smi command not found. CUDA might not be installed.")
except subprocess.CalledProcessError as e:
    print("nvidia-smi returned an error:")
    print(e.stderr.decode('utf-8'))

For TPUs (Google Cloud):

If you're using Google Cloud TPUs, check TPU availability with:

python
import tensorflow as tf
print("Available TPU devices:")
print(tf.config.list_physical_devices('TPU'))
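
On a Cloud TPU VM, if the list above comes back empty, you can also try connecting to and initializing the TPU system explicitly. The snippet below is a minimal sketch using TensorFlow's standard TPU APIs and assumes a TPU runtime is attached to the VM:

python
import tensorflow as tf

# Resolve and initialize the attached TPU (this raises an error if no TPU is reachable)
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

print("Logical TPU devices:")
print(tf.config.list_logical_devices('TPU'))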

Troubleshooting

If you encounter any issues while importing dependencies:

  1. Ensure you've activated the virtual environment where Cifer is installed.

  2. Verify that you have the latest version of Cifer:

bash
pip install --upgrade cifer

  3. For GPU-related issues, ensure your NVIDIA drivers and CUDA toolkit are up to date and compatible with the installed TensorFlow and PyTorch versions (see the commands after this list).

  4. If problems persist, consult the Cifer documentation or contact the support team.
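
To confirm the driver and toolkit versions mentioned in step 3, you can use the standard NVIDIA command-line tools (nvcc is only available if the CUDA toolkit itself is installed):

bash
# Show the installed NVIDIA driver version and visible GPUs
nvidia-smi

# Show the CUDA toolkit (compiler) version, if the toolkit is installed
nvcc --version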

By running these import and check steps, you can confirm that all necessary dependencies for Cifer's FedLearn are correctly installed and ready to use, allowing you to proceed with your federated learning tasks.
