diff --git a/README.md b/README.md
index f9b85c5..c2476bc 100644
--- a/README.md
+++ b/README.md
@@ -1,78 +1,94 @@
-# **Hugging Face Hub Uploader Plugin for Automatic1111**
+# Hugging Face Backup Extension for Stable Diffusion WebUI
-Welcome to our peculiar plugin repository! This project is a unique extension of our Jupyter notebook editions, curated by a diverse DID system with a passion for experimentation. We affectionately call it "opinionated," reflecting our unconventional approach.
+Welcome to our unique extension, designed to help you easily back up your valuable Stable Diffusion WebUI files to the Hugging Face Hub! This project is brought to you by the Duskfall Portal Crew, a diverse DID system passionate about creativity and AI.
 Screenshot 2024-11-10 at 15 02 44
+## About This Extension
+This extension provides a simple way to back up your Stable Diffusion WebUI models, embeddings, and other important files to a repository on the Hugging Face Hub. We prioritize ease of use and reliability, helping you safeguard your valuable work.
-## **About this Plugin**
+## Key Features
-This plugin enables seamless uploading of checkpoint files from Automatic1111 to the Hugging Face Hub, a popular platform for machine learning model sharing and collaboration. Our primary goal is to facilitate quick and easy uploads, while prioritizing simplicity and accessibility.
+* **Easy Backup:** Back up your model files, VAEs, embeddings, and LoRAs directly from your Stable Diffusion WebUI.
+* **Hugging Face Integration:** Seamlessly upload your files to your Hugging Face Hub repository.
+* **Automatic Backup (Manual Start):** After you start a backup from the UI, it can repeat automatically on a configurable interval.
+* **Granular Status Updates:** Clear progress information is shown during the backup process.
+* **Credential Store:** Authentication uses your stored git credentials by default; you can opt out and supply a token through the `HF_TOKEN` environment variable instead.
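The credential lookup behind the feature list above can be sketched as a small standalone helper. This is an illustrative sketch, not the extension's actual API: the function name and parameters are invented here, and only the lookup order (WebUI settings, extension UI field, cached login, then `HF_TOKEN`) mirrors the extension's behavior.

```python
import os

def resolve_hf_token(settings_token=None, ui_token=None, cached_token=None):
    """Return the first available token, mirroring the extension's lookup
    order: WebUI settings, the extension's UI field, a cached Hugging Face
    login, and finally the HF_TOKEN environment variable."""
    for token in (settings_token, ui_token, cached_token, os.getenv("HF_TOKEN")):
        if token:
            return token
    raise RuntimeError("No Hugging Face token found")
```

With no arguments, only the environment variable is consulted, which is the opt-out path mentioned above.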
-## **Features**
+## Getting Started
-* Upload checkpoint files from Automatic1111 to the Hugging Face Hub
-* Supports multiple file uploads and batch processing
-* Automatically generates a PR on the Hugging Face Hub for easy model sharing and collaboration
+1. **Install:** Install this extension by placing it in the `extensions` folder of your Stable Diffusion WebUI.
+2. **Configure:**
+    * In the Stable Diffusion WebUI settings, go to the `Hugging Face` section and set your write access token.
+    * In the extension's UI tab, configure your Hugging Face username, repository name, paths to back up, and the SD WebUI folder.
+3. **Start Backup:** Click the "Start Backup" button to begin backing up your files.
-## **Getting Started**
+## Requirements
-1. Install the plugin by following the instructions in the Automatic1111 plugin repository.
-2. Configure your Hugging Face Hub credentials and repository settings.
-3. Select the checkpoint files you want to upload and click the "Upload to Hugging Face Hub" button.
+* Stable Diffusion WebUI (Automatic1111)
+* Hugging Face Hub account and a write access token.
+* Python 3.7 or later.
-## **Requirements**
+## How it Works
-* Automatic1111 - At this stage we're not sure which version this works with but we're testing it as we go.
-* Hugging Face Hub account and credentials - Instructions will come soon on this.
-* Python 3.7 or later
+1. **Configuration:** When a backup starts, the extension loads your saved settings.
+2. **Cloning or Creation:** The extension clones the configured Hugging Face repository, or creates it if it doesn't exist.
+3. **Copy Files:** The files in the specified paths are copied into the cloned repository.
+4. **Pushing:** The changes are pushed to your repository on the Hugging Face Hub.
+5. 
**Scheduled backups:** A backup runs when you trigger it from the UI, then repeats on the configured interval. The default interval is the `BACKUP_INTERVAL` constant (3600 seconds) in `hfbackup_script.py`, and it can be changed in the extension's UI.
-## **License**
## Settings
-This plugin is licensed under the MIT License.
+### Hugging Face Settings
+* **Hugging Face Write API Key:** Required to upload to your Hugging Face repository.
+* **Use Git Credential Store:** By default, the extension tries to use your stored git credentials; turn this off to supply the token through the `HF_TOKEN` environment variable instead.
-## **Acknowledgments**
+### Extension Settings
+* **Huggingface Token:** Your Hugging Face access token.
+* **Huggingface Username:** Your Hugging Face username or organization name.
+* **SD Webui Path:** The folder where Stable Diffusion WebUI is installed.
+* **Backup Paths:** The paths (one per line) to the models, embeddings, and other files you wish to back up. Paths must be relative to the Stable Diffusion WebUI root folder.
-This plugin builds upon the work of the Automatic1111 community and the Hugging Face Hub team. We appreciate their efforts in creating a robust and scalable platform for machine learning model sharing and collaboration.
+## License
-## **Support**
+This extension is licensed under the MIT License.
-If you encounter any issues or have questions about this plugin, please open an issue in the Automatic1111 plugin repository or reach out to the developer directly.
+## Acknowledgments
-## **Issues**
+This extension is built with the help of the Automatic1111 and Hugging Face communities. We are grateful for their efforts in creating such amazing and useful projects.
-Currently there is no "BACK END" settings file, so right now it auto just yeets the write API key to the tab.
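The backup step copies only files that actually changed, comparing size and SHA-256 before copying. A minimal standalone sketch of that idea, using only the standard library (the function names here are illustrative, not the extension's own):

```python
import hashlib
import os
import shutil

def sha256_of(path):
    """Stream a file through SHA-256 in 4 KB blocks, so large
    checkpoints are hashed without loading them into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(4096), b""):
            digest.update(block)
    return digest.hexdigest()

def copy_if_changed(src, dst):
    """Copy src to dst unless an identical file (same size and hash)
    is already present. Returns True if a copy happened."""
    if (os.path.exists(dst)
            and os.path.getsize(src) == os.path.getsize(dst)
            and sha256_of(src) == sha256_of(dst)):
        return False  # skipped, already up to date
    shutil.copy2(src, dst)
    return True
```

Checking the cheap size comparison first means most unchanged files are skipped without hashing at all.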
+## Support
-Currently struggling with something, it is currently not liking me but it's at least showing up for once.
-## **Changelog**
+If you encounter any issues or have questions about this extension, please open an issue in our GitHub repository.
-* Initial release: [Insert date]
-* Pre Alpha: June 8 2024 - "NOT OFFICIALLY A RELEASE"
-* November 1 - 2024: Rejigged the code via Claude, will test asap and add feature logs.
-* November 10 - Tested it on A111, it's SEMI working -- I just uhhh need to poke at it more? - Right now i'ts doing the WHOLE folder rather than allowing to select, so i'll be fixing that next.
+## Known Issues
+* No "BACK END" settings file: settings are saved in the script and loaded directly by A1111.
+* Backups run only after the user clicks the "Start Backup" button; they then repeat on a timer until the WebUI stops.
+
+## Changelog
+
+* **Initial release:** *June 8 2024*
+* **Pre Alpha:** *June 8 2024* - "NOT OFFICIALLY A RELEASE"
+* **Rejig:** *November 1 2024* - Rejigged the code via Claude, will test asap and add feature logs.
+* **Semi Working:** *November 10 2024* - Tested it on A1111, it's SEMI working
 ## About & Links
 ### About Us
-We are the Duskfall Portal Crew, a DID system with over 300 alters, navigating life with DID, ADHD, Autism, and CPTSD. We believe in AI’s potential to break down barriers and enhance mental health, despite its challenges. Join us on our creative journey exploring identity and expression.
+We are the Duskfall Portal Crew, a DID system with over 300 alters, navigating life with DID, ADHD, Autism, and CPTSD. We believe in AI’s potential to break down barriers and enhance mental health.
#### Join Our Community -Website: [End Media](https://end-media.org/) WEBSITE UNDER CONSTRUCTION LOOKING FOR SPONSORS - -Discord: [Join our Discord](https://discord.gg/5t2kYxt7An) - -Backups: [Hugging Face](https://huggingface.co/EarthnDusk) - -Support Us: [Send a Pizza](https://ko-fi.com/duskfallcrew/) - -Coffee: [BuyMeSomeMochas!](https://www.buymeacoffee.com/duskfallxcrew) - -Patreon: [Our Barely Used Patreon](https://www.patreon.com/earthndusk) +* **Website:** [End Media](https://end-media.org/) (WEBSITE UNDER CONSTRUCTION) +* **Discord:** [Join our Discord](https://discord.gg/5t2kYxt7An) +* **Hugging Face:** [Hugging Face](https://huggingface.co/EarthnDusk) +* **Support Us:** [Send a Pizza](https://ko-fi.com/duskfallcrew/) +* **Coffee:** [BuyMeSomeMochas!](https://www.buymeacoffee.com/duskfallxcrew) +* **Patreon:** [Our Barely Used Patreon](https://www.patreon.com/earthndusk) -Community Groups: +#### Community Groups -Subreddit: [Reddit](https://www.reddit.com/r/earthndusk/) +* **Subreddit:** [Reddit](https://www.reddit.com/r/earthndusk/) diff --git a/index.js b/index.js deleted file mode 100644 index 474ac43..0000000 --- a/index.js +++ /dev/null @@ -1,32 +0,0 @@ -const uploadComponent = { - template: ` -
- - - - - -
- `, - script: () => { - const hfUsernameInput = document.getElementById('hf-username'); - const hfRepoInput = document.getElementById('hf-repo'); - const writeKeyInput = document.getElementById('write-key'); - const ckptFilesInput = document.getElementById('ckpt-files'); - const uploadBtn = document.getElementById('upload-btn'); - - uploadBtn.addEventListener('click', async () => { - const hfUsername = hfUsernameInput.value; - const hfRepo = hfRepoInput.value; - const writeKey = writeKeyInput.value; - const ckptFiles = ckptFilesInput.files; - - try { - const response = await api.uploadToHuggingFaceHub(hfUsername, hfRepo, writeKey, ckptFiles); - console.log(response); - } catch (error) { - console.error(error); - } - }); - } -}; diff --git a/install.py b/install.py index 5189788..00c9f01 100644 --- a/install.py +++ b/install.py @@ -1,9 +1,7 @@ import launch -if not launch.is_installed("huggingface_hub"): - launch.run_pip("install huggingface_hub==4.30.2", "requirements for Hugging Face Hub Uploader") - -if not launch.is_installed("glob2"): - launch.run_pip("install glob2", "requirements for Hugging Face Hub Uploader") +if not launch.is_installed("apscheduler"): + launch.run_pip("install apscheduler", "requirements for Hugging Face Backup") -launch.run_pip("install -r requirements.txt") +if not launch.is_installed("huggingface_hub"): + launch.run_pip("install huggingface_hub", "requirements for Hugging Face Backup") diff --git a/requirements.txt b/requirements.txt index ea60cec..4b3550a 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,2 +1,2 @@ -huggingface_hub==4.30.2 -glob2 +huggingface_hub +apscheduler diff --git a/scripts/hfbackup.py b/scripts/hfbackup.py deleted file mode 100644 index 756be24..0000000 --- a/scripts/hfbackup.py +++ /dev/null @@ -1,110 +0,0 @@ -# scripts/hf_backup.py -import modules.scripts as scripts -import gradio as gr -import os -from modules import shared, script_callbacks -from huggingface_hub import HfApi -import glob - -def 
upload_to_huggingface(username, repo, write_key, dir_path, file_type, pr_message): - try: - # Initialize HF API - api = HfApi(token=write_key or shared.opts.data.get("hf_write_key", "")) - if not api.token: - return "Error: No Hugging Face Write API Key provided" - - repo_id = f"{username}/{repo}" - results = [] - - # Get files from directory - if dir_path: - files = glob.glob(os.path.join(dir_path, f"*.{file_type}")) - if not files: - return f"No .{file_type} files found in {dir_path}" - - for file_path in files: - try: - file_name = os.path.basename(file_path) - results.append(f"Uploading: {file_name}") - - response = api.upload_file( - path_or_fileobj=file_path, - path_in_repo=file_name, - repo_id=repo_id, - create_pr=True, - commit_message=pr_message or f"Upload {file_name}" - ) - results.append(f"✓ Successfully uploaded {file_name}") - - except Exception as e: - results.append(f"✗ Error uploading {file_name}: {str(e)}") - - return "\n".join(results) - except Exception as e: - return f"Error: {str(e)}" - -def on_ui_tabs(): - with gr.Blocks(analytics_enabled=False) as hf_interface: - with gr.Column(): - gr.HTML(""" -
-

🤗 Hugging Face Hub Uploader
Backup your models and files to Hugging Face

-
- """) - - with gr.Row(): - username = gr.Textbox( - label="Hugging Face Username", - placeholder="Your HF username", - value=shared.opts.data.get("hf_default_username", "") - ) - repository = gr.Textbox( - label="Repository Name", - placeholder="Name of the repository" - ) - - with gr.Row(): - write_key = gr.Textbox( - label="Write Key (optional if set in settings)", - placeholder="Your HF write token", - type="password" - ) - - with gr.Row(): - dir_picker = gr.Textbox( - label="Directory Path", - placeholder="Path to directory containing files" - ) - file_type = gr.Radio( - label="File Type", - choices=["safetensors", "ckpt", "pt", "bin", "zip", "jpg", "png"], - value="safetensors", - type="value" - ) - - pr_message = gr.Textbox( - label="Pull Request Message", - placeholder="Description of your upload", - value="Backup files" - ) - - upload_button = gr.Button("🚀 Upload to Hugging Face") - result = gr.Textbox(label="Results", interactive=False) - - upload_button.click( - fn=upload_to_huggingface, - inputs=[ - username, - repository, - write_key, - dir_picker, - file_type, - pr_message - ], - outputs=result - ) - - return [(hf_interface, "HF Backup", "hf_backup_tab")] - -script_callbacks.on_ui_tabs(on_ui_tabs) diff --git a/scripts/hfbackup_script.py b/scripts/hfbackup_script.py new file mode 100644 index 0000000..fff15d3 --- /dev/null +++ b/scripts/hfbackup_script.py @@ -0,0 +1,256 @@ +import os +import datetime +#import threading +import gradio as gr +import logging +from modules import scripts, script_callbacks, shared, paths +from modules.scripts import basedir +from huggingface_hub import HfApi, HfFolder +import shutil +from pathlib import Path +import hashlib +import gc +from apscheduler.schedulers.background import BackgroundScheduler + +# Constants +REPO_NAME = 'sd-webui-backups' +BACKUP_INTERVAL = 3600 # 1 hour in seconds +HF_TOKEN_KEY = 'hf_token' +BACKUP_PATHS_KEY = 'backup_paths' +SD_PATH_KEY = 'sd_path' +HF_USER_KEY = 'hf_user' +DEFAULT_BACKUP_PATHS = 
['models/Stable-diffusion', 'models/VAE', 'embeddings', 'models/Lora']
+
+# --- Logging Setup ---
+logging.basicConfig(level=logging.INFO,
+                    format='%(asctime)s - %(levelname)s - %(message)s',
+                    handlers=[logging.StreamHandler()])
+logger = logging.getLogger(__name__)
+
+# --- Helper function for updating the status ---
+def update_status(script, status, file=None):
+    if file:
+        script.status = f"{status}: {file}"
+        print(f"{status}: {file}")  # For console logging.
+    else:
+        script.status = status
+        print(status)  # For console logging
+
+# --- HfApi Related Functions ---
+def get_hf_token(script):
+    # Lookup order: WebUI settings, extension UI, cached hub login, HF_TOKEN env var.
+    if shared.opts.hf_write_key: hf_token = shared.opts.hf_write_key
+    elif script.hf_token: hf_token = script.hf_token
+    elif HfFolder.get_token(): hf_token = HfFolder.get_token()
+    else: hf_token = os.getenv("HF_TOKEN")
+    if not hf_token:
+        update_status(script, "No Hugging Face token found (settings, UI, cached login, or HF_TOKEN)")
+        raise Exception("No Hugging Face token found (settings, UI, cached login, or HF_TOKEN)")
+    return hf_token
+
+def get_hf_user(script):
+    if script.hf_user: return script.hf_user
+    hf_token = get_hf_token(script)
+    api = HfApi(token=hf_token)
+    whoami = api.whoami(token=hf_token)
+    return whoami.get("name", "") if isinstance(whoami, dict) else ""
+
+def clone_or_create_repo(repo_id: str, repo_type: str, repo_path: str, script):
+    update_status(script, "Checking/Cloning Repo...")
+    logger.info(f"Cloning repository from {repo_id} to {repo_path}")
+    try:
+        hf_token = get_hf_token(script)
+        api = HfApi(token=hf_token)
+        if os.path.exists(repo_path) and os.path.isdir(repo_path):
+            logger.info(f"Repository already exists at {repo_path}, updating...")
+            ignore_paths = get_ingore_paths(repo_path, repo_id, repo_type, hf_token)
+        else:
+            os.makedirs(repo_path, exist_ok=True)
+            ignore_paths = []
+        if api.repo_exists(repo_id=repo_id, repo_type=repo_type, token=hf_token):
+            api.snapshot_download(repo_id=repo_id, repo_type=repo_type, local_dir=repo_path,
ignore_patterns=ignore_paths, token=hf_token) + except Exception as e: + logger.error(f"Error creating or cloning repo: {e}") + update_status(script, f"Error creating or cloning repo: {e}") + raise + update_status(script, "Repo ready") + +def get_path_in_repo(path: str): + return str(Path(path)).replace("\\", "/") + +def get_sha256(filename: str): + if not Path(filename).exists(): return None + sha256_hash = hashlib.sha256() + with open(filename, "rb") as f: + for byte_block in iter(lambda: f.read(4096), b""): + sha256_hash.update(byte_block) + return sha256_hash.hexdigest() + +def is_same_file(filename: str, dst_repo: str, dst_type: str, dst_path: str, hf_token: str): + api = HfApi(token=hf_token) + if not filename or not Path(filename).exists() or Path(filename).is_dir(): return False + dst_path = get_path_in_repo(dst_path) + if not api.file_exists(repo_id=dst_repo, filename=dst_path, repo_type=dst_type, token=hf_token): return False + src_sha256 = get_sha256(filename) + src_size = os.path.getsize(filename) + dst_info = api.get_paths_info(repo_id=dst_repo, paths=dst_path, repo_type=dst_type, token=hf_token) + if not dst_info or len(dst_info) != 1 or dst_info[0].lfs is None: return False + if src_size == dst_info[0].size and src_sha256 == dst_info[0].lfs.sha256: return True + else: return False + +def get_ingore_paths(path: str, repo_id: str, repo_type: str, hf_token: str): + ignores = [] + for p in Path(path).glob("**/*"): + if p.is_dir(): continue + rp = p.resolve().relative_to(Path(path).resolve()) + if is_same_file(str(p), repo_id, repo_type, str(rp), hf_token): ignores.append(get_path_in_repo(str(rp))) + if len(ignores) != 0: print(f"These files are already latest: {', '.join(ignores)}") # debug + return ignores + +def safe_copy(src: str, dst: str, script): + if not Path(src).exists(): return + if Path(dst).exists() and os.path.getsize(src) == os.path.getsize(dst) and get_sha256(src) == get_sha256(dst): + logger.info(f"Skipped: {src}") + update_status(script, 
"Skipped", src) + return + shutil.copy2(src, dst) + logger.info(f"Copied: {src}") + update_status(script, "Copied", src) + +def hf_push_files(repo_id: str, repo_type: str, repo_path: str, commit_message: str, script): + update_status(script, "Pushing changes...") + try: + hf_token = get_hf_token(script) + api = HfApi(token=hf_token) + ignore_paths = get_ingore_paths(repo_path, repo_id, repo_type, hf_token) + api.create_repo(repo_id=repo_id, repo_type=repo_type, exist_ok=True, token=hf_token) + api.upload_folder(repo_id=repo_id, repo_type=repo_type, folder_path=repo_path, path_in_repo="", + ignore_patterns=ignore_paths, commit_message=commit_message, token=hf_token) + logger.info(f"Changes pushed successfully to remote repository.") + update_status(script, "Pushing Complete") + except Exception as e: + logger.error(f"HF push failed: {e}") + update_status(script, f"HF push failed: {e}") + raise + +# --- Backup Logic --- +def backup_files(backup_paths, script): + repo_type = "model" + logger.info("Starting backup...") + update_status(script, "Starting Backup...") + repo_id = get_hf_user(script) + "/" + script.hf_repo + repo_path = os.path.join(script.basedir, 'backup') + sd_path = script.sd_path if script.sd_path else paths.data_path + try: + clone_or_create_repo(repo_id, repo_type, repo_path, script) + except Exception as e: + logger.error("Error starting the backup, please see the traceback.") + return + for base_path in backup_paths: + logger.info(f"Backing up: {base_path}") + for root, _, files in os.walk(os.path.join(sd_path, base_path)): + for file in files: + local_file_path = os.path.join(root, file) + repo_file_path = os.path.relpath(local_file_path, start=sd_path) + try: + os.makedirs(os.path.dirname(os.path.join(repo_path, repo_file_path)), exist_ok=True) + safe_copy(local_file_path, os.path.join(repo_path, repo_file_path), script) + except Exception as e: + logger.error(f"Error copying {repo_file_path}: {e}") + update_status(script, f"Error copying: 
{repo_file_path}: {e}")
+                    return
+    try:
+        hf_push_files(repo_id, repo_type, repo_path, f"Backup at {datetime.datetime.now()}", script)
+        logger.info("Backup complete")
+        update_status(script, "Backup Complete")
+    except Exception as e:
+        logger.error(f"Error pushing to the repo: {e}")
+        return
+
+def start_backup_thread(script, is_scheduled: bool, backup_interval: int):
+    backup_files(script.backup_paths, script)
+    #threading.Thread(target=backup_files, args=(script.backup_paths, script), daemon=True).start()
+    script.update_schedule(is_scheduled, backup_interval)
+    gc.collect()
+
+# Gradio UI Setup
+def on_ui(script):
+    with gr.Blocks(analytics_enabled=False) as hf_backup:
+        with gr.Column():
+            with gr.Row():
+                with gr.Column(scale=3):
+                    hf_token_box = gr.Textbox(label="Huggingface Token", type='password', value=script.hf_token)
+                    def on_token_change(token: str):
+                        script.hf_token = token
+                    hf_token_box.change(on_token_change, inputs=[hf_token_box], outputs=None)
+                with gr.Column(scale=1):
+                    status_box = gr.Textbox(label="Status", value=script.status)
+            with gr.Row():
+                is_scheduled = gr.Checkbox(label="Scheduled backups", value=True)
+                backup_interval = gr.Number(label="Backup interval (seconds)", step=1, minimum=60, maximum=36000, value=BACKUP_INTERVAL)
+                def on_start_button(is_scheduled: bool, backup_interval: int, progress=gr.Progress(track_tqdm=True)):
+                    start_backup_thread(script, is_scheduled, backup_interval)
+                    return "Starting Backup"
+                start_button = gr.Button(value="Start Backup")
+                start_button.click(on_start_button, inputs=[is_scheduled, backup_interval], outputs=[status_box])
+            with gr.Row():
+                with gr.Column():
+                    sd_path_box = gr.Textbox(label="SD Webui Path", value=script.sd_path)
+                    def on_sd_path_change(path: str):
+                        script.sd_path = path
+                    sd_path_box.change(on_sd_path_change, inputs=[sd_path_box], outputs=None)
+                with gr.Column():
+                    hf_user_box = gr.Textbox(label="Huggingface Username", value=script.hf_user)
+                    def on_hf_user_change(user: str):
+                        script.hf_user = user
+                    hf_user_box.change(on_hf_user_change, inputs=[hf_user_box], outputs=None)
+                with gr.Column():
+                    hf_repo_box = gr.Textbox(label="Huggingface Reponame", value=script.hf_repo)
+                    def on_hf_repo_change(repo: str):
+                        script.hf_repo = repo
+                    hf_repo_box.change(on_hf_repo_change, inputs=[hf_repo_box], outputs=None)
+            with gr.Row():
+                backup_paths_box = gr.Textbox(label="Backup Paths (one path per line)", lines=4, value='\n'.join(script.backup_paths) if isinstance(script.backup_paths, list) else "")
+                def on_backup_paths_change(paths: str):
+                    paths_list = [p.strip() for p in paths.split('\n') if p.strip()]
+                    script.backup_paths = paths_list
+                backup_paths_box.change(on_backup_paths_change, inputs=[backup_paths_box], outputs=None)
+    return [(hf_backup, "Huggingface Backup", "hfbackup_script")]
+
+class HFBackupScript:
+    env = {}
+
+    def __init__(self):
+        self.hf_token = self.env.get(HF_TOKEN_KEY, "")
+        self.backup_paths = self.env.get(BACKUP_PATHS_KEY, DEFAULT_BACKUP_PATHS)
+        self.sd_path = self.env.get(SD_PATH_KEY, "")
+        self.hf_user = self.env.get(HF_USER_KEY, "")
+        self.hf_repo = REPO_NAME
+        self.status = "Not running"
+        self.basedir = basedir()
+        self.scheduler = None
+
+    def title(self):
+        return "Huggingface Backup"
+
+    def show(self, is_img2img=None):
+        return scripts.AlwaysVisible
+
+    def on_ui(self, is_img2img=None):
+        return on_ui(self)
+
+    def update_schedule(self, is_scheduled: bool, backup_interval: int):
+        if self.scheduler is not None:
+            self.scheduler.shutdown()
+            self.scheduler = None  # keep the attribute so later checks don't raise AttributeError
+            gc.collect()
+        if is_scheduled:
+            self.scheduler = BackgroundScheduler()
+            self.scheduler.add_job(func=backup_files, args=[self.backup_paths, self], trigger="interval", id="backup", replace_existing=True, seconds=backup_interval)
+            self.scheduler.start()
+
+if __package__ == "hfbackup_script":
+    script = HFBackupScript()
+    script_callbacks.on_ui_tabs(script.on_ui)
diff --git a/scripts/manifest.json b/scripts/manifest.json deleted file mode 100644 index 
9eeb5bc..0000000 --- a/scripts/manifest.json +++ /dev/null @@ -1,14 +0,0 @@ -# manifest.json -{ - "name": "Hugging Face Backup", - "version": "1.0.0", - "description": "Backup your models and files to Hugging Face Hub", - "author": "Duskfallcrew @ Civitai / Earth & Dusk Media", - "repository": "", - "tags": ["huggingface", "backup", "models"], - "requirements": ["huggingface_hub==4.30.2", "glob2"], - "dependencies": {}, - "extension": { - "type": "tab", - "categories": ["utilities"] - } diff --git a/scripts/settings.py b/scripts/settings.py deleted file mode 100644 index 9dff717..0000000 --- a/scripts/settings.py +++ /dev/null @@ -1,30 +0,0 @@ -import gradio as gr -import os -from modules import shared, script_callbacks -from huggingface_hub import HfApi -import glob - -def on_ui_settings(): - section = ('huggingface', "Hugging Face") - shared.opts.add_option( - "hf_write_key", - shared.OptionInfo( - "", - "Hugging Face Write API Key", - gr.Password, # Changed to Password type for security - {"interactive": True}, - section=section - ) - ) - shared.opts.add_option( - "hf_read_key", - shared.OptionInfo( - "", - "Hugging Face Read API Key", - gr.Password, # Changed to Password type for security - {"interactive": True}, - section=section - ) - ) - -script_callbacks.on_ui_settings(on_ui_settings) diff --git a/scripts/ui-settings.py b/scripts/ui-settings.py new file mode 100644 index 0000000..9b7687d --- /dev/null +++ b/scripts/ui-settings.py @@ -0,0 +1,17 @@ +import gradio as gr +from modules import shared, script_callbacks + +def on_ui_settings(): + section = ('huggingface', "Hugging Face") + shared.opts.add_option( + "hf_write_key", + shared.OptionInfo( + "", + "Hugging Face Write API Key", + gr.Textbox, + {"interactive": True, "type": "password", "lines": 1}, + section=section + ) + ) + +script_callbacks.on_ui_settings(on_ui_settings) diff --git a/scripts/verify_setup.py b/scripts/verify_setup.py deleted file mode 100644 index 8a76c36..0000000 --- 
a/scripts/verify_setup.py +++ /dev/null @@ -1,52 +0,0 @@ -# scripts/verify_setup.py -import os -import importlib.util -import sys - -def verify_extension_setup(): - results = [] - - # Check directory structure - base_dir = os.path.dirname(os.path.dirname(__file__)) - required_files = [ - 'install.py', - 'requirements.txt', - 'scripts/hf_backup.py', - 'scripts/settings.py' - ] - - for file in required_files: - if os.path.exists(os.path.join(base_dir, file)): - results.append(f"✓ Found {file}") - else: - results.append(f"✗ Missing {file}") - - # Check dependencies - dependencies = { - 'huggingface_hub': '4.30.2', - 'glob2': '0.7' - } - - for package, version in dependencies.items(): - try: - imported = importlib.import_module(package) - actual_version = getattr(imported, '__version__', 'unknown') - if actual_version == version: - results.append(f"✓ {package} version {version} installed correctly") - else: - results.append(f"! {package} version mismatch. Expected {version}, got {actual_version}") - except ImportError: - results.append(f"✗ {package} not installed") - - # Check HF token (if set) - from modules import shared - if hasattr(shared.opts, 'data') and shared.opts.data.get("hf_write_key"): - results.append("✓ HF Write token is set") - else: - results.append("! HF Write token not set in settings") - - return "\n".join(results) - -# You can run this directly to test: -if __name__ == "__main__": - print(verify_extension_setup()) diff --git a/style.css b/style.css deleted file mode 100644 index a0cb271..0000000 --- a/style.css +++ /dev/null @@ -1,7 +0,0 @@ -# style.css -.hf-uploader-container { - margin: 1rem; - padding: 1rem; - border-radius: 8px; - background: var(--background-fill-primary); -}
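As a footnote to the path handling in `hfbackup_script.py`: each local file is mapped to a repo-side path relative to the WebUI root, with backslashes normalized to forward slashes so that backups made on Windows still produce valid Hub paths. A rough standalone sketch of that mapping (the function name is illustrative, not the script's own):

```python
import os
from pathlib import PurePosixPath, PureWindowsPath

def repo_path_for(local_file, sd_root):
    """Map a local file to its path inside the backup repo: relative to
    the WebUI root, joined with forward slashes regardless of host OS."""
    rel = os.path.relpath(local_file, start=sd_root)
    # PureWindowsPath splits on both separators; PurePosixPath rejoins with "/".
    return str(PurePosixPath(*PureWindowsPath(rel).parts))
```

This keeps a single repository layout (`models/VAE/...`, `embeddings/...`) no matter which platform ran the backup.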