Add custom nodes, Civitai loras (LFS), and vast.ai setup script
Some checks failed
Python Linting / Run Ruff (push) Has been cancelled
Python Linting / Run Pylint (push) Has been cancelled
Full Comfy CI Workflow Runs / test-stable (12.1, , linux, 3.10, [self-hosted Linux], stable) (push) Has been cancelled
Full Comfy CI Workflow Runs / test-stable (12.1, , linux, 3.11, [self-hosted Linux], stable) (push) Has been cancelled
Full Comfy CI Workflow Runs / test-stable (12.1, , linux, 3.12, [self-hosted Linux], stable) (push) Has been cancelled
Full Comfy CI Workflow Runs / test-unix-nightly (12.1, , linux, 3.11, [self-hosted Linux], nightly) (push) Has been cancelled
Execution Tests / test (macos-latest) (push) Has been cancelled
Execution Tests / test (ubuntu-latest) (push) Has been cancelled
Execution Tests / test (windows-latest) (push) Has been cancelled
Test server launches without errors / test (push) Has been cancelled
Unit Tests / test (macos-latest) (push) Has been cancelled
Unit Tests / test (ubuntu-latest) (push) Has been cancelled
Unit Tests / test (windows-2022) (push) Has been cancelled
Includes 30 custom nodes committed directly, 7 Civitai-exclusive loras stored via Git LFS, and a setup script that installs all dependencies and downloads HuggingFace-hosted models on vast.ai.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
53 custom_nodes/comfyui-manager/glob/README.md Normal file
@@ -0,0 +1,53 @@
# ComfyUI-Manager: Core Backend (glob)

This directory contains the Python backend modules that power ComfyUI-Manager, handling the core functionality of node management, downloading, security, and server operations.

## Core Modules

- **manager_core.py**: The central implementation of management functions, handling configuration, installation, updates, and node management.
- **manager_server.py**: Implements server functionality and API endpoints for the web interface to interact with the backend.
- **manager_downloader.py**: Handles downloading operations for models, extensions, and other resources.
- **manager_util.py**: Provides utility functions used throughout the system.

## Specialized Modules

- **cm_global.py**: Maintains global variables and state management across the system.
- **cnr_utils.py**: Helper utilities for interacting with the custom node registry (CNR).
- **git_utils.py**: Git-specific utilities for repository operations.
- **node_package.py**: Handles the packaging and installation of node extensions.
- **security_check.py**: Implements the multi-level security system for installation safety.
- **share_3rdparty.py**: Manages integration with third-party sharing platforms.

## Architecture

The backend follows a modular design with clear separation of concerns:

1. **Core Layer**: Manager modules provide the primary API and business logic
2. **Utility Layer**: Helper modules provide specialized functionality
3. **Integration Layer**: Modules that connect to external systems

## Security Model

The system implements a security framework with multiple levels:

- **Block**: Highest security - blocks most remote operations
- **High**: Allows only specific trusted operations
- **Middle**: Standard security for most users
- **Normal-**: More permissive for advanced users
- **Weak**: Lowest security, intended for development environments

## Implementation Details

- The backend is designed to work seamlessly with ComfyUI
- Asynchronous task queuing is implemented for background operations
- The system supports multiple installation modes
- Error handling and risk assessment are integrated throughout the codebase

## API Integration

The backend exposes a REST API via `manager_server.py` that enables:

- Custom node management (install, update, disable, remove)
- Model downloading and organization
- System configuration
- Snapshot management
- Workflow component handling
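The security-level gate described above can be sketched in isolation. This is a minimal sketch, not the Manager's implementation: the `default` section and `security_level` key mirror the `config.ini` layout read by `manager_migration.py` in this commit, and the exact level ordering here is an assumption for illustration.

```python
import configparser
from io import StringIO

# Security levels from most to least restrictive (names as listed above);
# the ordering is an assumption for this sketch.
LEVELS = ['block', 'high', 'middle', 'normal', 'normal-', 'weak']


def is_low_security(config_text: str) -> bool:
    """Return True when the configured level is more permissive than 'normal'."""
    config = configparser.ConfigParser()
    config.read_file(StringIO(config_text))
    level = config.get('default', 'security_level', fallback='normal').lower()
    return LEVELS.index(level) > LEVELS.index('normal')


print(is_low_security("[default]\nsecurity_level = weak\n"))    # permissive
print(is_low_security("[default]\nsecurity_level = middle\n"))  # restrictive
```

This matches the behavior of `check_suspicious_manager` below, which treats `weak` and `normal-` as low-security settings.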
117 custom_nodes/comfyui-manager/glob/cm_global.py Normal file
@@ -0,0 +1,117 @@
import traceback

#
# Global Var
#
# Usage:
# import cm_global
# cm_global.variables['comfyui.revision'] = 1832
# print(f"log mode: {cm_global.variables['logger.enabled']}")
#
variables = {}


#
# Global API
#
# Usage:
# [register API]
# import cm_global
#
# def api_hello(msg):
#     print(f"hello: {msg}")
#     return msg
#
# cm_global.register_api('hello', api_hello)
#
# [use API]
# import cm_global
#
# test = cm_global.try_call(api='hello', msg='an example')
# print(f"'{test}' is returned")
#

APIs = {}


def register_api(k, f):
    global APIs
    APIs[k] = f


def try_call(**kwargs):
    if 'api' in kwargs:
        api_name = kwargs['api']
        try:
            api = APIs.get(api_name)
            if api is not None:
                del kwargs['api']
                return api(**kwargs)
            else:
                print(f"WARN: The '{api_name}' API has not been registered.")
        except Exception as e:
            print(f"ERROR: An exception occurred while calling the '{api_name}' API.")
            raise e
    else:
        return None


#
# Extension Info
#
# Usage:
# import cm_global
#
# cm_global.extension_infos['my_extension'] = {'version': [0, 1], 'name': 'me', 'description': 'example extension', }
#
extension_infos = {}

on_extension_registered_handlers = {}


def register_extension(extension_name, v):
    global extension_infos
    global on_extension_registered_handlers

    extension_infos[extension_name] = v

    if extension_name in on_extension_registered_handlers:
        for k, f in on_extension_registered_handlers[extension_name]:
            try:
                f(extension_name, v)
            except Exception:
                print(f"[ERROR] '{k}' on_extension_registered_handlers")
                traceback.print_exc()

        del on_extension_registered_handlers[extension_name]


def add_on_extension_registered(k, extension_name, f):
    global on_extension_registered_handlers

    if extension_name in extension_infos:
        try:
            v = extension_infos[extension_name]
            f(extension_name, v)
        except Exception:
            print(f"[ERROR] '{k}' on_extension_registered_handler")
            traceback.print_exc()
    else:
        if extension_name not in on_extension_registered_handlers:
            on_extension_registered_handlers[extension_name] = []

        on_extension_registered_handlers[extension_name].append((k, f))


def add_on_revision_detected(k, f):
    if 'comfyui.revision' in variables:
        try:
            f(variables['comfyui.revision'])
        except Exception:
            print(f"[ERROR] '{k}' on_revision_detected_handler")
            traceback.print_exc()
    else:
        # NOTE: the handler list is expected to be initialized by the caller
        variables['cm.on_revision_detected_handler'].append((k, f))


error_dict = {}

disable_front = False
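The registry-and-dispatch pattern in `cm_global.py` can be exercised standalone. This sketch reimplements `register_api`/`try_call` in simplified form rather than importing the module (which lives inside the ComfyUI tree); behavior on the error paths is condensed but analogous.

```python
APIs = {}


def register_api(k, f):
    """Register callable f under key k."""
    APIs[k] = f


def try_call(**kwargs):
    """Dispatch to a registered API selected by the reserved 'api' keyword."""
    api_name = kwargs.pop('api', None)
    if api_name is None:
        return None
    api = APIs.get(api_name)
    if api is None:
        print(f"WARN: The '{api_name}' API has not been registered.")
        return None
    return api(**kwargs)


register_api('hello', lambda msg: f"hello: {msg}")
print(try_call(api='hello', msg='an example'))  # hello: an example
print(try_call(api='missing'))                  # None (after a warning)
```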
253 custom_nodes/comfyui-manager/glob/cnr_utils.py Normal file
@@ -0,0 +1,253 @@
import asyncio
import json
import os
import platform
import time
from dataclasses import dataclass
from typing import List

import manager_core
import manager_util
import requests
import toml

base_url = "https://api.comfy.org"

lock = asyncio.Lock()

is_cache_loading = False


async def get_cnr_data(cache_mode=True, dont_wait=True):
    try:
        return await _get_cnr_data(cache_mode, dont_wait)
    except asyncio.TimeoutError:
        print("A timeout occurred during the fetch process from ComfyRegistry.")
        return await _get_cnr_data(cache_mode=True, dont_wait=True)  # timeout fallback


async def _get_cnr_data(cache_mode=True, dont_wait=True):
    global is_cache_loading

    uri = f'{base_url}/nodes'

    async def fetch_all():
        remained = True
        page = 1

        full_nodes = {}

        # Determine form factor based on environment and platform
        is_desktop = bool(os.environ.get('__COMFYUI_DESKTOP_VERSION__'))
        system = platform.system().lower()
        is_windows = system == 'windows'
        is_mac = system == 'darwin'
        is_linux = system == 'linux'

        # Get ComfyUI version tag
        if is_desktop:
            # extract version from pyproject.toml instead of git tag
            comfyui_ver = manager_core.get_current_comfyui_ver() or 'unknown'
        else:
            comfyui_ver = manager_core.get_comfyui_tag() or 'unknown'

        if is_desktop:
            if is_windows:
                form_factor = 'desktop-win'
            elif is_mac:
                form_factor = 'desktop-mac'
            else:
                form_factor = 'other'
        else:
            if is_windows:
                form_factor = 'git-windows'
            elif is_mac:
                form_factor = 'git-mac'
            elif is_linux:
                form_factor = 'git-linux'
            else:
                form_factor = 'other'

        while remained:
            # Add comfyui_version and form_factor to the API request
            sub_uri = f'{base_url}/nodes?page={page}&limit=30&comfyui_version={comfyui_ver}&form_factor={form_factor}'
            sub_json_obj = await asyncio.wait_for(manager_util.get_data_with_cache(sub_uri, cache_mode=False, silent=True, dont_cache=True), timeout=30)
            remained = page < sub_json_obj['totalPages']

            for x in sub_json_obj['nodes']:
                full_nodes[x['id']] = x

            if page % 5 == 0:
                print(f"FETCH ComfyRegistry Data: {page}/{sub_json_obj['totalPages']}")

            page += 1
            time.sleep(0.5)

        print("FETCH ComfyRegistry Data [DONE]")

        for v in full_nodes.values():
            if 'latest_version' not in v:
                v['latest_version'] = dict(version='nightly')

        return {'nodes': list(full_nodes.values())}

    if cache_mode:
        is_cache_loading = True
        cache_state = manager_util.get_cache_state(uri)

        if dont_wait:
            if cache_state == 'not-cached':
                return {}
            else:
                print("[ComfyUI-Manager] The ComfyRegistry cache update is still in progress, so an outdated cache is being used.")
                with open(manager_util.get_cache_path(uri), 'r', encoding="UTF-8", errors="ignore") as json_file:
                    return json.load(json_file)['nodes']

        if cache_state == 'cached':
            with open(manager_util.get_cache_path(uri), 'r', encoding="UTF-8", errors="ignore") as json_file:
                return json.load(json_file)['nodes']

    res = {}
    try:
        json_obj = await fetch_all()
        manager_util.save_to_cache(uri, json_obj)
        return json_obj['nodes']
    except Exception:
        print("Cannot connect to comfyregistry.")
    finally:
        if cache_mode:
            is_cache_loading = False

    return res


@dataclass
class NodeVersion:
    changelog: str
    dependencies: List[str]
    deprecated: bool
    id: str
    version: str
    download_url: str


def map_node_version(api_node_version):
    """
    Maps node version data from API response to NodeVersion dataclass.

    Args:
        api_node_version (dict): The 'node_version' part of the API response.

    Returns:
        NodeVersion: An instance of NodeVersion dataclass populated with data from the API.
    """
    return NodeVersion(
        changelog=api_node_version.get("changelog", ""),  # default value if 'changelog' is missing
        dependencies=api_node_version.get("dependencies", []),  # default empty list if 'dependencies' is missing
        deprecated=api_node_version.get("deprecated", False),  # assume False if 'deprecated' is not specified
        id=api_node_version["id"],  # 'id' is mandatory; raises KeyError if missing
        version=api_node_version["version"],  # 'version' is mandatory; raises KeyError if missing
        download_url=api_node_version.get("downloadUrl", ""),  # default value if 'downloadUrl' is missing
    )


def install_node(node_id, version=None):
    """
    Retrieves the node version for installation.

    Args:
        node_id (str): The unique identifier of the node.
        version (str, optional): Specific version of the node to retrieve. If omitted, the latest version is returned.

    Returns:
        NodeVersion: Node version data, or None on failure.
    """
    if version is None:
        url = f"{base_url}/nodes/{node_id}/install"
    else:
        url = f"{base_url}/nodes/{node_id}/install?version={version}"

    response = requests.get(url, verify=not manager_util.bypass_ssl)
    if response.status_code == 200:
        # Convert the API response to a NodeVersion object
        return map_node_version(response.json())
    else:
        return None


def all_versions_of_node(node_id):
    url = f"{base_url}/nodes/{node_id}/versions?statuses=NodeVersionStatusActive&statuses=NodeVersionStatusPending"

    response = requests.get(url, verify=not manager_util.bypass_ssl)
    if response.status_code == 200:
        return response.json()
    else:
        return None


def read_cnr_info(fullpath):
    try:
        toml_path = os.path.join(fullpath, 'pyproject.toml')
        tracking_path = os.path.join(fullpath, '.tracking')

        if not os.path.exists(toml_path) or not os.path.exists(tracking_path):
            return None  # not a valid CNR node pack

        with open(toml_path, "r", encoding="utf-8") as f:
            data = toml.load(f)

        project = data.get('project', {})
        name = project.get('name').strip().lower()

        # normalize version
        # for example: 2.5 -> 2.5.0
        version = str(manager_util.StrictVersion(project.get('version')))

        urls = project.get('urls', {})
        repository = urls.get('Repository')

        if name and version:  # repository is optional
            return {
                "id": name,
                "version": version,
                "url": repository
            }

        return None
    except Exception:
        return None  # not a valid CNR node pack


def generate_cnr_id(fullpath, cnr_id):
    cnr_id_path = os.path.join(fullpath, '.git', '.cnr-id')
    try:
        if not os.path.exists(cnr_id_path):
            with open(cnr_id_path, "w") as f:
                return f.write(cnr_id)
    except Exception:
        print(f"[ComfyUI Manager] unable to create file: {cnr_id_path}")


def read_cnr_id(fullpath):
    cnr_id_path = os.path.join(fullpath, '.git', '.cnr-id')
    try:
        if os.path.exists(cnr_id_path):
            with open(cnr_id_path) as f:
                return f.read().strip()
    except Exception:
        pass

    return None
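The defensive mapping in `map_node_version` can be tried standalone. This sketch copies the dataclass and mapper (without the module's registry imports) and feeds it a minimal, illustrative payload to show the fallback behavior for optional fields.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class NodeVersion:
    changelog: str
    dependencies: List[str]
    deprecated: bool
    id: str
    version: str
    download_url: str


def map_node_version(api_node_version: dict) -> NodeVersion:
    # Optional fields fall back to safe defaults; 'id' and 'version'
    # are mandatory and raise KeyError when absent.
    return NodeVersion(
        changelog=api_node_version.get("changelog", ""),
        dependencies=api_node_version.get("dependencies", []),
        deprecated=api_node_version.get("deprecated", False),
        id=api_node_version["id"],
        version=api_node_version["version"],
        download_url=api_node_version.get("downloadUrl", ""),
    )


# Illustrative payload: only the mandatory keys are supplied
nv = map_node_version({"id": "comfyui-example", "version": "1.2.0"})
print(nv.deprecated, repr(nv.download_url))  # False ''
```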
87 custom_nodes/comfyui-manager/glob/git_utils.py Normal file
@@ -0,0 +1,87 @@
import os
import configparser


GITHUB_ENDPOINT = os.getenv('GITHUB_ENDPOINT')


def is_git_repo(path: str) -> bool:
    """ Check if the path is a git repository. """
    # NOTE: Checking it through `git.Repo` must be avoided.
    #       It locks the file, causing issues on Windows.
    return os.path.exists(os.path.join(path, '.git'))


def get_commit_hash(fullpath):
    git_head = os.path.join(fullpath, '.git', 'HEAD')
    if os.path.exists(git_head):
        with open(git_head) as f:
            line = f.readline()

        if line.startswith("ref: "):
            ref = os.path.join(fullpath, '.git', line[5:].strip())
            if os.path.exists(ref):
                with open(ref) as f2:
                    return f2.readline().strip()
            else:
                return "unknown"
        else:
            return line.strip()  # detached HEAD: the line is the hash itself

    return "unknown"


def git_url(fullpath):
    """
    Resolve the remote url of an unclassified custom node based on .git/config.
    """
    git_config_path = os.path.join(fullpath, '.git', 'config')

    if not os.path.exists(git_config_path):
        return None

    # Set `strict=False` to allow duplicate `vscode-merge-base` sections, addressing <https://github.com/ltdrdata/ComfyUI-Manager/issues/1529>
    config = configparser.ConfigParser(strict=False)
    config.read(git_config_path)

    for k, v in config.items():
        if k.startswith('remote ') and 'url' in v:
            if 'Comfy-Org/ComfyUI-Manager' in v['url']:
                return "https://github.com/ltdrdata/ComfyUI-Manager"
            return v['url']

    return None


def normalize_url(url) -> str:
    github_id = normalize_to_github_id(url)
    if github_id is not None:
        url = f"https://github.com/{github_id}"

    return url


def normalize_to_github_id(url) -> str:
    if 'github' in url or (GITHUB_ENDPOINT is not None and GITHUB_ENDPOINT in url):
        author = os.path.basename(os.path.dirname(url))

        if author.startswith('git@github.com:'):
            author = author.split(':')[1]

        repo_name = os.path.basename(url)
        if repo_name.endswith('.git'):
            repo_name = repo_name[:-4]

        return f"{author}/{repo_name}"

    return None


def get_url_for_clone(url):
    url = normalize_url(url)

    if GITHUB_ENDPOINT is not None and url.startswith('https://github.com/'):
        url = GITHUB_ENDPOINT + url[18:]  # url[18:] -> strip `https://github.com`

    return url
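The URL normalization above relies on `os.path` operations treating the URL like a path. A self-contained sketch (reimplementing the logic without the module's `GITHUB_ENDPOINT` handling) shows how both HTTPS and SSH clone URLs reduce to an `author/repo` id:

```python
import os


def normalize_to_github_id(url):
    """Reduce a GitHub clone URL to 'author/repo' (mirrors git_utils.py)."""
    if 'github' not in url:
        return None

    # dirname/basename split the URL like a filesystem path
    author = os.path.basename(os.path.dirname(url))
    if author.startswith('git@github.com:'):
        author = author.split(':')[1]  # SSH form: strip the host prefix

    repo_name = os.path.basename(url)
    if repo_name.endswith('.git'):
        repo_name = repo_name[:-4]

    return f"{author}/{repo_name}"


print(normalize_to_github_id('https://github.com/ltdrdata/ComfyUI-Manager.git'))
# ltdrdata/ComfyUI-Manager
print(normalize_to_github_id('git@github.com:foo/bar.git'))
# foo/bar
```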
3498 custom_nodes/comfyui-manager/glob/manager_core.py Normal file
File diff suppressed because it is too large
163 custom_nodes/comfyui-manager/glob/manager_downloader.py Normal file
@@ -0,0 +1,163 @@
import os
from urllib.parse import urlparse
import urllib.request
import sys
import logging

import requests
from huggingface_hub import HfApi
from tqdm.auto import tqdm


aria2 = os.getenv('COMFYUI_MANAGER_ARIA2_SERVER')
HF_ENDPOINT = os.getenv('HF_ENDPOINT')


if aria2 is not None:
    secret = os.getenv('COMFYUI_MANAGER_ARIA2_SECRET')
    url = urlparse(aria2)
    port = url.port
    host = url.scheme + '://' + url.hostname
    import aria2p

    aria2 = aria2p.API(aria2p.Client(host=host, port=port, secret=secret))


def basic_download_url(url, dest_folder: str, filename: str):
    '''
    Download a file from url to dest_folder with filename
    using the requests library.
    '''
    # Ensure the destination folder exists
    if not os.path.exists(dest_folder):
        os.makedirs(dest_folder)

    # Full path to save the file
    dest_path = os.path.join(dest_folder, filename)

    # Download the file
    response = requests.get(url, stream=True)
    if response.status_code == 200:
        with open(dest_path, 'wb') as file:
            for chunk in response.iter_content(chunk_size=1024):
                if chunk:
                    file.write(chunk)
    else:
        raise Exception(f"Failed to download file from {url}")


def download_url(model_url: str, model_dir: str, filename: str):
    if HF_ENDPOINT:
        model_url = model_url.replace('https://huggingface.co', HF_ENDPOINT)
        logging.info(f"model_url replaced by HF_ENDPOINT, new = {model_url}")
    if aria2:
        return aria2_download_url(model_url, model_dir, filename)
    else:
        from torchvision.datasets.utils import download_url as torchvision_download_url
        try:
            return torchvision_download_url(model_url, model_dir, filename)
        except Exception as e:
            logging.error(f"[ComfyUI-Manager] Failed to download: {model_url} / {repr(e)}")
            raise


def aria2_find_task(dir: str, filename: str):
    target = os.path.join(dir, filename)

    downloads = aria2.get_downloads()

    for download in downloads:
        for file in download.files:
            if file.is_metadata:
                continue
            if str(file.path) == target:
                return download


def aria2_download_url(model_url: str, model_dir: str, filename: str):
    import manager_core as core
    import tqdm
    import time

    if model_dir.startswith(core.comfy_path):
        model_dir = model_dir[len(core.comfy_path):]

    download_dir = model_dir if model_dir.startswith('/') else os.path.join('/models', model_dir)

    download = aria2_find_task(download_dir, filename)
    if download is None:
        options = {'dir': download_dir, 'out': filename}
        download = aria2.add(model_url, options)[0]

    if download.is_active:
        with tqdm.tqdm(
            total=download.total_length,
            bar_format='{l_bar}{bar}{r_bar}',
            desc=filename,
            unit='B',
            unit_scale=True,
        ) as progress_bar:
            while download.is_active:
                if progress_bar.total == 0 and download.total_length != 0:
                    progress_bar.reset(download.total_length)
                progress_bar.update(download.completed_length - progress_bar.n)
                time.sleep(1)
                download.update()


def download_url_with_agent(url, save_path):
    try:
        headers = {
            'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3'}

        req = urllib.request.Request(url, headers=headers)
        response = urllib.request.urlopen(req)
        data = response.read()

        if not os.path.exists(os.path.dirname(save_path)):
            os.makedirs(os.path.dirname(save_path))

        with open(save_path, 'wb') as f:
            f.write(data)

    except Exception as e:
        print(f"Download error: {url} / {e}", file=sys.stderr)
        return False

    print("Installation was successful.")
    return True


# NOTE: snapshot_download doesn't provide file size tqdm.
def download_repo_in_bytes(repo_id, local_dir):
    api = HfApi()
    repo_info = api.repo_info(repo_id=repo_id, files_metadata=True)

    os.makedirs(local_dir, exist_ok=True)

    total_size = 0
    for file_info in repo_info.siblings:
        if file_info.size is not None:
            total_size += file_info.size

    pbar = tqdm(total=total_size, unit="B", unit_scale=True, desc="Downloading")

    for file_info in repo_info.siblings:
        out_path = os.path.join(local_dir, file_info.rfilename)
        os.makedirs(os.path.dirname(out_path), exist_ok=True)

        if file_info.size is None:
            continue

        download_url = f"https://huggingface.co/{repo_id}/resolve/main/{file_info.rfilename}"

        with requests.get(download_url, stream=True) as r, open(out_path, "wb") as f:
            r.raise_for_status()
            for chunk in r.iter_content(chunk_size=65536):
                if chunk:
                    f.write(chunk)
                    pbar.update(len(chunk))

    pbar.close()
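The chunked-write-plus-progress loop used in `download_repo_in_bytes` can be exercised without any network access by feeding it an iterable of byte chunks. In this sketch, `chunks` stands in for `response.iter_content(chunk_size=...)`, and progress reporting is reduced to a plain callback instead of tqdm:

```python
from io import BytesIO


def stream_to_file(chunks, out_file, on_progress=None):
    """Write an iterable of byte chunks to out_file, reporting bytes written.

    `chunks` stands in for `response.iter_content(chunk_size=...)` above.
    """
    written = 0
    for chunk in chunks:
        if chunk:  # skip empty keep-alive chunks, as in the downloader above
            out_file.write(chunk)
            written += len(chunk)
            if on_progress:
                on_progress(written)
    return written


buf = BytesIO()
total = stream_to_file([b'abc', b'', b'defgh'], buf)
print(total, buf.getvalue())  # 8 b'abcdefgh'
```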
356 custom_nodes/comfyui-manager/glob/manager_migration.py Normal file
@@ -0,0 +1,356 @@
|
||||
"""
|
||||
ComfyUI-Manager migration module.
|
||||
Handles migration from legacy paths to new __manager path structure.
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import subprocess
|
||||
import configparser
|
||||
|
||||
# Startup notices for notice board
|
||||
startup_notices = [] # List of (message, level) tuples
|
||||
|
||||
|
||||
def add_startup_notice(message, level='warning'):
|
||||
"""Add a notice to be displayed on Manager notice board.
|
||||
|
||||
Args:
|
||||
message: HTML-formatted message string
|
||||
level: 'warning', 'error', 'info'
|
||||
"""
|
||||
global startup_notices
|
||||
startup_notices.append((message, level))
|
||||
|
||||
|
||||
# Cache for API check (computed once per session)
|
||||
_cached_has_system_user_api = None
|
||||
|
||||
|
||||
def has_system_user_api():
|
||||
"""Check if ComfyUI has the System User Protection API (PR #10966).
|
||||
|
||||
Result is cached for performance.
|
||||
"""
|
||||
global _cached_has_system_user_api
|
||||
if _cached_has_system_user_api is None:
|
||||
try:
|
||||
import folder_paths
|
||||
_cached_has_system_user_api = hasattr(folder_paths, 'get_system_user_directory')
|
||||
except Exception:
|
||||
_cached_has_system_user_api = False
|
||||
return _cached_has_system_user_api
|
||||
|
||||
|
||||
def get_manager_path(user_dir):
|
||||
"""Get the appropriate manager files path based on ComfyUI version.
|
||||
|
||||
Returns:
|
||||
str: manager_files_path
|
||||
"""
|
||||
if has_system_user_api():
|
||||
return os.path.abspath(os.path.join(user_dir, '__manager'))
|
||||
else:
|
||||
return os.path.abspath(os.path.join(user_dir, 'default', 'ComfyUI-Manager'))
|
||||
|
||||
|
||||
def run_migration_checks(user_dir, manager_files_path):
|
||||
"""Run all migration and security checks.
|
||||
|
||||
Call this after get_manager_path() to handle:
|
||||
- Legacy config migration (new ComfyUI)
|
||||
- Legacy backup notification (every startup)
|
||||
- Suspicious directory detection (old ComfyUI)
|
||||
- Outdated ComfyUI warning (old ComfyUI)
|
||||
"""
|
||||
if has_system_user_api():
|
||||
migrated = migrate_legacy_config(user_dir, manager_files_path)
|
||||
# Only check for legacy backup if migration didn't just happen
|
||||
# (migration already shows backup location in its message)
|
||||
if not migrated:
|
||||
check_legacy_backup(manager_files_path)
|
||||
else:
|
||||
check_suspicious_manager(user_dir)
|
||||
warn_outdated_comfyui()
|
||||
|
||||
|
||||
def check_legacy_backup(manager_files_path):
|
||||
"""Check for legacy backup and notify user to verify and remove it.
|
||||
|
||||
This runs on every startup to remind users about pending legacy backup.
|
||||
"""
|
||||
backup_dir = os.path.join(manager_files_path, '.legacy-manager-backup')
|
||||
if not os.path.exists(backup_dir):
|
||||
return
|
||||
|
||||
# Terminal output
|
||||
print("\n" + "-"*70)
|
||||
print("[ComfyUI-Manager] NOTICE: Legacy backup exists")
|
||||
print(" - Your old Manager data was backed up to:")
|
||||
print(f" {backup_dir}")
|
||||
print(" - Please verify and remove it when no longer needed.")
|
||||
print("-"*70 + "\n")
|
||||
|
||||
# Notice board output
|
||||
add_startup_notice(
|
||||
"Legacy ComfyUI-Manager data backup exists. Please verify and remove when no longer needed. See terminal for details.",
|
||||
level='info'
|
||||
)
|
||||
|
||||
|
||||
def check_suspicious_manager(user_dir):
|
||||
"""Check for suspicious __manager directory on old ComfyUI.
|
||||
|
||||
On old ComfyUI without System User API, if __manager exists with low security,
|
||||
warn the user to verify manually.
|
||||
|
||||
Returns:
|
||||
bool: True if suspicious setup detected
|
||||
"""
|
||||
if has_system_user_api():
|
||||
return False # Not suspicious on new ComfyUI
|
||||
|
||||
suspicious_path = os.path.abspath(os.path.join(user_dir, '__manager'))
|
||||
if not os.path.exists(suspicious_path):
|
||||
return False
|
||||
|
||||
config_path = os.path.join(suspicious_path, 'config.ini')
|
||||
if not os.path.exists(config_path):
|
||||
return False
|
||||
|
||||
config = configparser.ConfigParser()
|
||||
config.read(config_path)
|
||||
sec_level = config.get('default', 'security_level', fallback='normal').lower()
|
||||
|
||||
if sec_level in ['weak', 'normal-']:
|
||||
# Terminal output
|
||||
print("\n" + "!"*70)
|
||||
print("[ComfyUI-Manager] ERROR: Suspicious path detected!")
|
||||
print(f" - '__manager' exists with low security level: '{sec_level}'")
|
||||
print(" - Please verify manually:")
|
||||
print(f" {config_path}")
|
||||
print("!"*70 + "\n")
|
||||
|
||||
# Notice board output
|
||||
add_startup_notice(
|
||||
"[Security Alert] Suspicious path detected. See terminal log for details.",
|
||||
level='error'
|
||||
)
|
||||
return True
|
||||
|
||||
return False
|
||||
|
||||
|
||||
def warn_outdated_comfyui():
|
||||
"""Warn user about outdated ComfyUI without System User API."""
|
||||
if has_system_user_api():
|
||||
return
|
||||
|
||||
# Terminal output
|
||||
print("\n" + "!"*70)
|
||||
print("[ComfyUI-Manager] ERROR: ComfyUI version is outdated!")
|
||||
print(" - Most operations are blocked for security.")
|
||||
print(" - ComfyUI update is still allowed.")
|
||||
print(" - Please update ComfyUI to use Manager normally.")
|
||||
print("!"*70 + "\n")
|
||||
|
||||
# Notice board output
|
||||
add_startup_notice(
|
||||
"[Security Alert] ComfyUI outdated. Installations blocked (update allowed).<BR>"
|
||||
"Update ComfyUI for normal operation.",
|
||||
level='error'
|
||||
)
|
||||
|
||||
|
||||
def migrate_legacy_config(user_dir, manager_files_path):
    """Migrate ONLY config.ini to new __manager path if needed.

    IMPORTANT: Only config.ini is migrated. Other files (snapshots, cache, etc.)
    are NOT migrated - users must recreate them.

    Scenarios:
        1. Legacy exists, New doesn't exist → Migrate config.ini
        2. Legacy exists, New exists → First update after upgrade
           - Run ComfyUI dependency installation
           - Rename legacy to .backup
        3. Legacy doesn't exist → No migration needed

    Returns:
        bool: True if migration was performed
    """
    if not has_system_user_api():
        return False

    legacy_dir = os.path.join(user_dir, 'default', 'ComfyUI-Manager')
    legacy_config = os.path.join(legacy_dir, 'config.ini')
    new_config = os.path.join(manager_files_path, 'config.ini')

    if not os.path.exists(legacy_dir):
        return False  # No legacy directory, nothing to migrate

    # IMPORTANT: Check for config.ini existence, not just directory
    # (because makedirs() creates __manager before this function is called)

    # Case: Both configs exist (first update after ComfyUI upgrade)
    # This means user ran new ComfyUI at least once, creating __manager/config.ini
    if os.path.exists(legacy_config) and os.path.exists(new_config):
        _handle_first_update_migration(user_dir, legacy_dir, manager_files_path)
        return True

    # Case: Legacy config exists but new config doesn't (normal migration)
    # This is the first run after ComfyUI upgrade
    if not os.path.exists(legacy_config):
        return False

    # Terminal output
    print("\n" + "-"*70)
    print("[ComfyUI-Manager] NOTICE: Legacy config.ini detected")
    print(f"    - Old: {legacy_config}")
    print(f"    - New: {new_config}")
    print("    - Migrating config.ini only (other files are NOT migrated).")
    print("    - Security level below 'normal' will be raised.")
    print("-"*70 + "\n")

    _migrate_config_with_security_check(legacy_config, new_config)

    # Move legacy directory to backup
    _move_legacy_to_backup(legacy_dir, manager_files_path)

    return True
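The three scenarios in the docstring reduce to a small decision table on the two `config.ini` existence checks. A minimal sketch (the helper `decide_migration` is hypothetical, not part of the module):

```python
def decide_migration(legacy_config_exists: bool, new_config_exists: bool) -> str:
    """Mirror migrate_legacy_config's branching on config.ini existence."""
    if legacy_config_exists and new_config_exists:
        return "first-update"   # run dependency install, back up legacy dir
    if legacy_config_exists:
        return "migrate"        # copy config.ini to __manager, then back up
    return "none"               # nothing to migrate

print(decide_migration(True, True))    # first-update
print(decide_migration(True, False))   # migrate
print(decide_migration(False, False))  # none
```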


def _handle_first_update_migration(user_dir, legacy_dir, manager_files_path):
    """Handle first ComfyUI update when both legacy and new directories exist.

    This scenario happens when:
        - User was on old ComfyUI (using default/ComfyUI-Manager)
        - ComfyUI was updated (now has System User API)
        - Manager already created __manager on first new run
        - But legacy directory still exists

    Actions:
        1. Run ComfyUI dependency installation
        2. Move legacy to __manager/.legacy-manager-backup
    """
    # Terminal output
    print("\n" + "-"*70)
    print("[ComfyUI-Manager] NOTICE: First update after ComfyUI upgrade detected")
    print("    - Both legacy and new directories exist.")
    print("    - Running ComfyUI dependency installation...")
    print("-"*70 + "\n")

    # Run ComfyUI dependency installation
    # Path: glob/manager_migration.py → glob → comfyui-manager → custom_nodes → ComfyUI
    try:
        comfyui_path = os.path.dirname(os.path.dirname(os.path.dirname(os.path.dirname(__file__))))
        requirements_path = os.path.join(comfyui_path, 'requirements.txt')
        if os.path.exists(requirements_path):
            subprocess.run([sys.executable, '-m', 'pip', 'install', '-r', requirements_path],
                           capture_output=True, check=False)
            print("[ComfyUI-Manager] ComfyUI dependencies installation completed.")
    except Exception as e:
        print(f"[ComfyUI-Manager] WARNING: Failed to install ComfyUI dependencies: {e}")

    # Move legacy to backup inside __manager
    _move_legacy_to_backup(legacy_dir, manager_files_path)


def _move_legacy_to_backup(legacy_dir, manager_files_path):
    """Move legacy directory to backup inside __manager.

    Returns:
        str: Path to backup directory if successful, None if failed
    """
    import shutil

    backup_dir = os.path.join(manager_files_path, '.legacy-manager-backup')

    try:
        if os.path.exists(backup_dir):
            shutil.rmtree(backup_dir)  # Remove old backup if it exists
        shutil.move(legacy_dir, backup_dir)

        # Terminal output (full paths shown here only)
        print("\n" + "-"*70)
        print("[ComfyUI-Manager] NOTICE: Legacy settings migrated")
        print(f"    - Old location: {legacy_dir}")
        print(f"    - Backed up to: {backup_dir}")
        print("    - Please verify and remove the backup when no longer needed.")
        print("-"*70 + "\n")

        # Notice board output (no full paths for security)
        add_startup_notice(
            "Legacy ComfyUI-Manager data migrated. See terminal for details.",
            level='info'
        )
        return backup_dir
    except Exception as e:
        print(f"[ComfyUI-Manager] WARNING: Failed to backup legacy directory: {e}")
        add_startup_notice(
            f"[MIGRATION] Failed to backup legacy directory: {e}",
            level='warning'
        )
        return None


def _migrate_config_with_security_check(legacy_path, new_path):
    """Migrate legacy config, raising security level only if below default."""
    config = configparser.ConfigParser()
    try:
        config.read(legacy_path)
    except Exception as e:
        print(f"[ComfyUI-Manager] WARNING: Failed to parse config.ini: {e}")
        print("    - Creating fresh config with default settings.")
        add_startup_notice(
            "[MIGRATION] Failed to parse legacy config. Using defaults.",
            level='warning'
        )
        return  # Skip migration, let Manager create a fresh config

    # Security level hierarchy: strong > normal > normal- > weak
    # Default is 'normal'; only raise if below default
    if 'default' in config:
        current_level = config['default'].get('security_level', 'normal').lower()
        below_default_levels = ['weak', 'normal-']

        if current_level in below_default_levels:
            config['default']['security_level'] = 'normal'

            # Terminal output
            print("\n" + "="*70)
            print("[ComfyUI-Manager] WARNING: Security level adjusted")
            print(f"    - Previous: '{current_level}' → New: 'normal'")
            print("    - Raised to prevent unauthorized remote access.")
            print("="*70 + "\n")

            # Notice board output
            add_startup_notice(
                f"[MIGRATION] Security level raised: '{current_level}' → 'normal'.<BR>"
                "To prevent unauthorized remote access.",
                level='warning'
            )
        else:
            print(f"    - Security level: '{current_level}' (no change needed)")

    # Ensure the directory exists
    os.makedirs(os.path.dirname(new_path), exist_ok=True)

    with open(new_path, 'w') as f:
        config.write(f)
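The raise-if-below-default rule above can be exercised in isolation with an in-memory config. A self-contained sketch (the helper `raise_security_level` is illustrative, not part of the module):

```python
import configparser

def raise_security_level(ini_text: str) -> str:
    """Raise security_level to 'normal' when it is below the default."""
    config = configparser.ConfigParser()
    config.read_string(ini_text)
    if 'default' in config:
        level = config['default'].get('security_level', 'normal').lower()
        if level in ('weak', 'normal-'):  # the two below-default levels
            config['default']['security_level'] = 'normal'
    return config['default'].get('security_level', 'normal')

print(raise_security_level("[default]\nsecurity_level = weak\n"))    # normal
print(raise_security_level("[default]\nsecurity_level = strong\n"))  # strong
```

Note that 'strong' is left untouched: the migration only raises levels, never lowers them.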


def force_security_level_if_needed(config_dict):
    """Force security level to 'strong' if on old ComfyUI.

    Args:
        config_dict: Configuration dictionary to modify in-place

    Returns:
        bool: True if security level was forced
    """
    if not has_system_user_api():
        config_dict['security_level'] = 'strong'
        return True
    return False
custom_nodes/comfyui-manager/glob/manager_server.py (new file, 1876 lines): diff suppressed because it is too large
custom_nodes/comfyui-manager/glob/manager_util.py (new file, 633 lines)
@@ -0,0 +1,633 @@
"""
description:
    `manager_util` is the lightest module shared across the prestartup_script, main code, and cm-cli of ComfyUI-Manager.
"""
import traceback

import aiohttp
import json
import threading
import os
from datetime import datetime
import subprocess
import sys
import re
import logging
import platform
import shlex
from functools import lru_cache


cache_lock = threading.Lock()

comfyui_manager_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
cache_dir = os.path.join(comfyui_manager_path, '.cache')  # This path is also updated together in **manager_core.update_user_directory**.

use_uv = False
bypass_ssl = False


def add_python_path_to_env():
    if platform.system() != "Windows":
        sep = ':'
    else:
        sep = ';'

    os.environ['PATH'] = os.path.dirname(sys.executable) + sep + os.environ['PATH']


@lru_cache(maxsize=2)
def get_pip_cmd(force_uv=False):
    """
    Get the base pip command, with automatic fallback to uv if pip is unavailable.

    Args:
        force_uv (bool): If True, use uv directly without trying pip

    Returns:
        list: Base command for pip operations
    """
    embedded = 'python_embeded' in sys.executable

    # Try pip first (unless forcing uv)
    if not force_uv:
        try:
            test_cmd = [sys.executable] + (['-s'] if embedded else []) + ['-m', 'pip', '--version']
            subprocess.check_output(test_cmd, stderr=subprocess.DEVNULL, timeout=5)
            return [sys.executable] + (['-s'] if embedded else []) + ['-m', 'pip']
        except Exception:
            logging.warning("[ComfyUI-Manager] `python -m pip` not available. Falling back to `uv`.")

    # Try uv (either forced or pip failed)
    import shutil

    # Try uv as a Python module
    try:
        test_cmd = [sys.executable] + (['-s'] if embedded else []) + ['-m', 'uv', '--version']
        subprocess.check_output(test_cmd, stderr=subprocess.DEVNULL, timeout=5)
        logging.info("[ComfyUI-Manager] Using `uv` as Python module for pip operations.")
        return [sys.executable] + (['-s'] if embedded else []) + ['-m', 'uv', 'pip']
    except Exception:
        pass

    # Try standalone uv
    if shutil.which('uv'):
        logging.info("[ComfyUI-Manager] Using standalone `uv` for pip operations.")
        return ['uv', 'pip']

    # Nothing worked
    logging.error("[ComfyUI-Manager] Neither `python -m pip` nor `uv` are available. Cannot proceed with package operations.")
    raise Exception("Neither `pip` nor `uv` are available for package management")


def make_pip_cmd(cmd):
    """
    Create a pip command by combining the cached base pip command with the given arguments.

    Args:
        cmd (list): List of pip command arguments (e.g., ['install', 'package'])

    Returns:
        list: Complete command list ready for subprocess execution
    """
    global use_uv
    base_cmd = get_pip_cmd(force_uv=use_uv)
    return base_cmd + cmd
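The base-command selection above is plain list composition. A pure sketch of how the embedded `-s` flag and the backend choice combine (the helper `build_pip_cmd` is hypothetical, for illustration only):

```python
def build_pip_cmd(executable: str, embedded: bool, backend: str, args: list) -> list:
    """Compose a pip invocation the way get_pip_cmd/make_pip_cmd do."""
    base = [executable] + (['-s'] if embedded else [])
    if backend == 'pip':
        base += ['-m', 'pip']
    elif backend == 'uv':          # uv installed as a Python module
        base += ['-m', 'uv', 'pip']
    else:                          # standalone uv binary on PATH
        base = ['uv', 'pip']
    return base + args

print(build_pip_cmd('/py/python', True, 'pip', ['install', 'numpy']))
# ['/py/python', '-s', '-m', 'pip', 'install', 'numpy']
```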


# DON'T USE StrictVersion - cannot handle pre_release version
# try:
#     from distutils.version import StrictVersion
# except:
#     print(f"[ComfyUI-Manager] 'distutils' package not found. Activating fallback mode for compatibility.")
class StrictVersion:
    def __init__(self, version_string):
        self.version_string = version_string
        self.major = 0
        self.minor = 0
        self.patch = 0
        self.pre_release = None
        self.parse_version_string()

    def parse_version_string(self):
        parts = self.version_string.split('.')
        if not parts:
            raise ValueError("Version string must not be empty")

        self.major = int(parts[0])
        self.minor = int(parts[1]) if len(parts) > 1 else 0
        self.patch = int(parts[2]) if len(parts) > 2 else 0

        # Handling pre-release versions if present
        if len(parts) > 3:
            self.pre_release = parts[3]

    def __str__(self):
        version = f"{self.major}.{self.minor}.{self.patch}"
        if self.pre_release:
            version += f"-{self.pre_release}"
        return version

    def __eq__(self, other):
        return (self.major, self.minor, self.patch, self.pre_release) == \
               (other.major, other.minor, other.patch, other.pre_release)

    def __lt__(self, other):
        if (self.major, self.minor, self.patch) == (other.major, other.minor, other.patch):
            return self.pre_release_compare(self.pre_release, other.pre_release) < 0
        return (self.major, self.minor, self.patch) < (other.major, other.minor, other.patch)

    @staticmethod
    def pre_release_compare(pre1, pre2):
        if pre1 == pre2:
            return 0
        if pre1 is None:
            return 1
        if pre2 is None:
            return -1
        return -1 if pre1 < pre2 else 1

    def __le__(self, other):
        return self == other or self < other

    def __gt__(self, other):
        return not self <= other

    def __ge__(self, other):
        return not self < other

    def __ne__(self, other):
        return not self == other
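The ordering rule that motivates this fallback class is that a release sorts after its own pre-releases, which `distutils.StrictVersion` could not express. A compact tuple-based sketch of the same comparison (the helpers `ver_key`/`ver_lt` are illustrative, not part of the module):

```python
def ver_key(s):
    """Split 'X.Y.Z[.pre]' into the comparable shape StrictVersion uses."""
    parts = s.split('.')
    nums = tuple(int(p) for p in parts[:3]) + (0,) * (3 - min(len(parts), 3))
    pre = parts[3] if len(parts) > 3 else None
    return nums, pre

def ver_lt(a, b):
    (na, pa), (nb, pb) = ver_key(a), ver_key(b)
    if na != nb:
        return na < nb
    if pa == pb:
        return False
    if pa is None:   # a release sorts AFTER its own pre-releases
        return False
    if pb is None:
        return True
    return pa < pb

print(ver_lt('2.1.0.rc1', '2.1.0'))  # True: pre-release precedes release
print(ver_lt('2.1.0', '2.0.9'))      # False
```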


def simple_hash(input_string):
    hash_value = 0
    for char in input_string:
        hash_value = (hash_value * 31 + ord(char)) % (2**32)

    return hash_value


def is_file_created_within_one_day(file_path):
    if not os.path.exists(file_path):
        return False

    file_creation_time = os.path.getctime(file_path)
    current_time = datetime.now().timestamp()
    time_difference = current_time - file_creation_time

    return time_difference <= 86400


async def get_data(uri, silent=False):
    if not silent:
        print(f"FETCH DATA from: {uri}", end="")

    if uri.startswith("http"):
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=not bypass_ssl)) as session:
            headers = {
                'Cache-Control': 'no-cache',
                'Pragma': 'no-cache',
                'Expires': '0'
            }
            async with session.get(uri, headers=headers) as resp:
                json_text = await resp.text()
    else:
        with cache_lock:
            with open(uri, "r", encoding="utf-8") as f:
                json_text = f.read()

    try:
        json_obj = json.loads(json_text)
    except Exception as e:
        logging.error(f"[ComfyUI-Manager] An error occurred while fetching '{uri}': {e}")

        return {}

    if not silent:
        print(" [DONE]")

    return json_obj


def get_cache_path(uri):
    cache_uri = str(simple_hash(uri)) + '_' + os.path.basename(uri).replace('&', "_").replace('?', "_").replace('=', "_")
    return os.path.join(cache_dir, cache_uri+'.json')
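The cache filename combines the rolling 31x hash with a sanitized basename, so distinct URLs with the same basename still get distinct files. A self-contained sketch of both pieces (`cache_filename` is a simplified stand-in that returns only the filename, not the full cache path):

```python
import os

def simple_hash(input_string):
    # Rolling polynomial hash, modulo 2**32, as in manager_util
    hash_value = 0
    for char in input_string:
        hash_value = (hash_value * 31 + ord(char)) % (2**32)
    return hash_value

def cache_filename(uri):
    # Flatten URL query characters so the name is filesystem-safe
    base = os.path.basename(uri).replace('&', '_').replace('?', '_').replace('=', '_')
    return f"{simple_hash(uri)}_{base}.json"

name = cache_filename('https://example.com/list.json?v=1')
print(name.endswith('_list.json_v_1.json'))  # True
```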


def get_cache_state(uri):
    cache_uri = get_cache_path(uri)

    if not os.path.exists(cache_uri):
        return "not-cached"
    elif is_file_created_within_one_day(cache_uri):
        return "cached"

    return "expired"


def save_to_cache(uri, json_obj, silent=False):
    cache_uri = get_cache_path(uri)

    with cache_lock:
        with open(cache_uri, "w", encoding='utf-8') as file:
            json.dump(json_obj, file, indent=4, sort_keys=True)
        if not silent:
            logging.info(f"[ComfyUI-Manager] default cache updated: {uri}")


async def get_data_with_cache(uri, silent=False, cache_mode=True, dont_wait=False, dont_cache=False):
    cache_uri = get_cache_path(uri)

    if cache_mode and dont_wait:
        # NOTE: return the cache if possible, even if it is expired, so do not cache
        if not os.path.exists(cache_uri):
            logging.error(f"[ComfyUI-Manager] The network connection is unstable, so it is operating in fallback mode: {uri}")

            return {}
        else:
            if not is_file_created_within_one_day(cache_uri):
                logging.error(f"[ComfyUI-Manager] The network connection is unstable, so it is operating in outdated cache mode: {uri}")

            return await get_data(cache_uri, silent=silent)

    if cache_mode and is_file_created_within_one_day(cache_uri):
        json_obj = await get_data(cache_uri, silent=silent)
    else:
        json_obj = await get_data(uri, silent=silent)
        if not dont_cache:
            with cache_lock:
                with open(cache_uri, "w", encoding='utf-8') as file:
                    json.dump(json_obj, file, indent=4, sort_keys=True)
                if not silent:
                    logging.info(f"[ComfyUI-Manager] default cache updated: {uri}")

    return json_obj


def sanitize_tag(x):
    # Escape angle brackets so tag text cannot inject HTML
    return x.replace('<', '&lt;').replace('>', '&gt;')


def extract_package_as_zip(file_path, extract_path):
    import zipfile
    try:
        with zipfile.ZipFile(file_path, "r") as zip_ref:
            zip_ref.extractall(extract_path)
            extracted_files = zip_ref.namelist()
        logging.info(f"Extracted zip file to {extract_path}")
        return extracted_files
    except zipfile.BadZipFile:
        logging.error(f"File '{file_path}' is not a zip or is corrupted.")
        return None


pip_map = None


def get_installed_packages(renew=False):
    global pip_map

    if renew or pip_map is None:
        try:
            result = subprocess.check_output(make_pip_cmd(['list']), universal_newlines=True)

            pip_map = {}
            for line in result.split('\n'):
                x = line.strip()
                if x:
                    y = line.split()
                    if y[0] == 'Package' or y[0].startswith('-'):
                        continue

                    normalized_name = y[0].lower().replace('-', '_')
                    pip_map[normalized_name] = y[1]
        except subprocess.CalledProcessError:
            logging.error("[ComfyUI-Manager] Failed to retrieve the information of installed pip packages.")
            return {}

    return pip_map


def clear_pip_cache():
    global pip_map
    pip_map = None


def parse_requirement_line(line):
    tokens = shlex.split(line)
    if not tokens:
        return None

    package_spec = tokens[0]

    pattern = re.compile(
        r'^(?P<package>[A-Za-z0-9_.+-]+)'
        r'(?P<operator>==|>=|<=|!=|~=|>|<)?'
        r'(?P<version>[A-Za-z0-9_.+-]*)$'
    )
    m = pattern.match(package_spec)
    if not m:
        return None

    package = m.group('package')
    operator = m.group('operator') or None
    version = m.group('version') or None

    index_url = None
    if '--index-url' in tokens:
        idx = tokens.index('--index-url')
        if idx + 1 < len(tokens):
            index_url = tokens[idx + 1]

    res = {'package': package}

    if operator is not None:
        res['operator'] = operator

    if version is not None:
        res['version'] = StrictVersion(version)

    if index_url is not None:
        res['index_url'] = index_url

    return res
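The parser above splits a requirements-style line into package, operator, version, and an optional `--index-url`. A self-contained sketch with the same regex, except the version is kept as a plain string instead of a `StrictVersion` (the helper `parse_req` is illustrative, not the module function):

```python
import re
import shlex

def parse_req(line):
    """Like parse_requirement_line, but keeps the version as a string."""
    tokens = shlex.split(line)
    if not tokens:
        return None
    m = re.match(
        r'^(?P<package>[A-Za-z0-9_.+-]+)'
        r'(?P<operator>==|>=|<=|!=|~=|>|<)?'
        r'(?P<version>[A-Za-z0-9_.+-]*)$',
        tokens[0],
    )
    if not m:
        return None
    res = {'package': m.group('package')}
    if m.group('operator'):
        res['operator'] = m.group('operator')
        res['version'] = m.group('version')
    if '--index-url' in tokens:
        idx = tokens.index('--index-url')
        if idx + 1 < len(tokens):
            res['index_url'] = tokens[idx + 1]
    return res

print(parse_req('torch==2.1.0 --index-url https://download.pytorch.org/whl/cu121'))
```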


torch_torchvision_torchaudio_version_map = {
    '2.7.0': ('0.22.0', '2.7.0'),
    '2.6.0': ('0.21.0', '2.6.0'),
    '2.5.1': ('0.20.0', '2.5.0'),
    '2.5.0': ('0.20.0', '2.5.0'),
    '2.4.1': ('0.19.1', '2.4.1'),
    '2.4.0': ('0.19.0', '2.4.0'),
    '2.3.1': ('0.18.1', '2.3.1'),
    '2.3.0': ('0.18.0', '2.3.0'),
    '2.2.2': ('0.17.2', '2.2.2'),
    '2.2.1': ('0.17.1', '2.2.1'),
    '2.2.0': ('0.17.0', '2.2.0'),
    '2.1.2': ('0.16.2', '2.1.2'),
    '2.1.1': ('0.16.1', '2.1.1'),
    '2.1.0': ('0.16.0', '2.1.0'),
    '2.0.1': ('0.15.2', '2.0.1'),
    '2.0.0': ('0.15.1', '2.0.0'),
}
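torch_rollback below uses this map to pin matching torchvision/torchaudio versions against the right PyTorch index. A pure sketch of the argument list it builds (two map entries are copied from above; `rollback_args` is a hypothetical helper, and it skips the StrictVersion normalization the real function performs):

```python
VERSION_MAP = {'2.5.1': ('0.20.0', '2.5.0'), '2.1.0': ('0.16.0', '2.1.0')}

def rollback_args(prev):
    """Build pip-install args for a 'X.Y.Z+platform' torch snapshot string."""
    torch_ver, _, plat = prev.partition('+')
    pinned = VERSION_MAP.get(torch_ver)
    if pinned is None:  # unknown release: fall back to the nightly index
        return ['install', '--pre', 'torch', 'torchvision', 'torchaudio',
                '--index-url', f"https://download.pytorch.org/whl/nightly/{plat}"]
    tv, ta = pinned
    return ['install', f'torch=={torch_ver}', f'torchvision=={tv}', f'torchaudio=={ta}',
            '--index-url', f"https://download.pytorch.org/whl/{plat}"]

print(rollback_args('2.5.1+cu124')[1])  # torch==2.5.1
```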


def torch_rollback(prev):
    spec = prev.split('+')
    if len(spec) > 1:
        platform = spec[1]
    else:
        cmd = make_pip_cmd(['install', '--force', 'torch', 'torchvision', 'torchaudio'])
        subprocess.check_output(cmd, universal_newlines=True)
        logging.error(cmd)
        return

    torch_ver = StrictVersion(spec[0])
    torch_ver = f"{torch_ver.major}.{torch_ver.minor}.{torch_ver.patch}"
    torch_torchvision_torchaudio_ver = torch_torchvision_torchaudio_version_map.get(torch_ver)

    if torch_torchvision_torchaudio_ver is None:
        cmd = make_pip_cmd(['install', '--pre', 'torch', 'torchvision', 'torchaudio',
                            '--index-url', f"https://download.pytorch.org/whl/nightly/{platform}"])
        logging.info("[ComfyUI-Manager] restore PyTorch to nightly version")
    else:
        torchvision_ver, torchaudio_ver = torch_torchvision_torchaudio_ver
        cmd = make_pip_cmd(['install', f'torch=={torch_ver}', f'torchvision=={torchvision_ver}', f"torchaudio=={torchaudio_ver}",
                            '--index-url', f"https://download.pytorch.org/whl/{platform}"])
        logging.info(f"[ComfyUI-Manager] restore PyTorch to {torch_ver}+{platform}")

    subprocess.check_output(cmd, universal_newlines=True)


class PIPFixer:
    def __init__(self, prev_pip_versions, comfyui_path, manager_files_path):
        self.prev_pip_versions = { **prev_pip_versions }
        self.comfyui_path = comfyui_path
        self.manager_files_path = manager_files_path

    def fix_broken(self):
        new_pip_versions = get_installed_packages(True)

        # remove `comfy` python package
        try:
            if 'comfy' in new_pip_versions:
                # '-y' avoids the interactive confirmation prompt, which would hang check_output
                cmd = make_pip_cmd(['uninstall', '-y', 'comfy'])
                subprocess.check_output(cmd, universal_newlines=True)

                logging.warning("[ComfyUI-Manager] 'comfy' python package is uninstalled.\nWARN: The 'comfy' package is completely unrelated to ComfyUI and should never be installed as it causes conflicts with ComfyUI.")
        except Exception as e:
            logging.error("[ComfyUI-Manager] Failed to uninstall `comfy` python package")
            logging.error(e)

        # fix torch - reinstall torch packages if the version has changed
        try:
            if 'torch' not in self.prev_pip_versions or 'torchvision' not in self.prev_pip_versions or 'torchaudio' not in self.prev_pip_versions:
                logging.error("[ComfyUI-Manager] PyTorch is not installed")
            elif self.prev_pip_versions['torch'] != new_pip_versions['torch'] \
                    or self.prev_pip_versions['torchvision'] != new_pip_versions['torchvision'] \
                    or self.prev_pip_versions['torchaudio'] != new_pip_versions['torchaudio']:
                torch_rollback(self.prev_pip_versions['torch'])
        except Exception as e:
            logging.error("[ComfyUI-Manager] Failed to restore PyTorch")
            logging.error(e)

        # fix opencv
        try:
            ocp = new_pip_versions.get('opencv-contrib-python')
            ocph = new_pip_versions.get('opencv-contrib-python-headless')
            op = new_pip_versions.get('opencv-python')
            oph = new_pip_versions.get('opencv-python-headless')

            versions = [ocp, ocph, op, oph]
            versions = [StrictVersion(x) for x in versions if x is not None]
            versions.sort(reverse=True)

            if len(versions) > 0:
                # upgrade to the maximum version
                targets = []
                cur = versions[0]
                if ocp is not None and StrictVersion(ocp) != cur:
                    targets.append('opencv-contrib-python')
                if ocph is not None and StrictVersion(ocph) != cur:
                    targets.append('opencv-contrib-python-headless')
                if op is not None and StrictVersion(op) != cur:
                    targets.append('opencv-python')
                if oph is not None and StrictVersion(oph) != cur:
                    targets.append('opencv-python-headless')

                if len(targets) > 0:
                    for x in targets:
                        cmd = make_pip_cmd(['install', f"{x}=={versions[0].version_string}"])
                        subprocess.check_output(cmd, universal_newlines=True)

                    logging.info(f"[ComfyUI-Manager] 'opencv' dependencies were fixed: {targets}")
        except Exception as e:
            logging.error("[ComfyUI-Manager] Failed to restore opencv")
            logging.error(e)

        # fix missing frontend
        try:
            # NOTE: the package name in requirements is 'comfyui-frontend-package',
            #       but the package name from `pip freeze` is 'comfyui_frontend_package',
            #       and the package name from `uv pip freeze` is 'comfyui-frontend-package'.
            #
            #       get_installed_packages returns the normalized name (i.e. comfyui_frontend_package)
            if 'comfyui_frontend_package' not in new_pip_versions:
                requirements_path = os.path.join(self.comfyui_path, 'requirements.txt')

                with open(requirements_path, 'r') as file:
                    lines = file.readlines()

                front_line = next((line.strip() for line in lines if line.startswith('comfyui-frontend-package')), None)
                if front_line is None:
                    logging.info("[ComfyUI-Manager] Skipped fixing the 'comfyui-frontend-package' dependency because ComfyUI is outdated.")
                else:
                    cmd = make_pip_cmd(['install', front_line])
                    subprocess.check_output(cmd, universal_newlines=True)
                    logging.info("[ComfyUI-Manager] The 'comfyui-frontend-package' dependency was fixed")
        except Exception as e:
            logging.error("[ComfyUI-Manager] Failed to restore comfyui-frontend-package")
            logging.error(e)

        # restore based on the custom list
        pip_auto_fix_path = os.path.join(self.manager_files_path, "pip_auto_fix.list")
        if os.path.exists(pip_auto_fix_path):
            with open(pip_auto_fix_path, 'r', encoding="UTF-8", errors="ignore") as f:
                fixed_list = []

                for x in f.readlines():
                    try:
                        parsed = parse_requirement_line(x)
                        need_to_reinstall = True

                        normalized_name = parsed['package'].lower().replace('-', '_')
                        if normalized_name in new_pip_versions:
                            if 'version' in parsed and 'operator' in parsed:
                                cur = StrictVersion(new_pip_versions[normalized_name])
                                dest = parsed['version']
                                op = parsed['operator']
                                if cur == dest:
                                    if op in ['==', '>=', '<=']:
                                        need_to_reinstall = False
                                elif cur < dest:
                                    if op in ['<=', '<', '~=', '!=']:
                                        need_to_reinstall = False
                                elif cur > dest:
                                    if op in ['>=', '>', '~=', '!=']:
                                        need_to_reinstall = False

                        if need_to_reinstall:
                            cmd_args = ['install']
                            if 'version' in parsed and 'operator' in parsed:
                                cmd_args.append(parsed['package'] + parsed['operator'] + parsed['version'].version_string)
                            else:
                                # bare package line without a version constraint
                                cmd_args.append(parsed['package'])

                            if 'index_url' in parsed:
                                cmd_args.append('--index-url')
                                cmd_args.append(parsed['index_url'])

                            cmd = make_pip_cmd(cmd_args)
                            subprocess.check_output(cmd, universal_newlines=True)

                            fixed_list.append(parsed['package'])
                    except Exception as e:
                        traceback.print_exc()
                        logging.error(f"[ComfyUI-Manager] Failed to restore '{x}'")
                        logging.error(e)

                if len(fixed_list) > 0:
                    logging.info(f"[ComfyUI-Manager] dependencies in pip_auto_fix.list were fixed: {fixed_list}")
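The reinstall decision in the pip_auto_fix loop is a small operator table: a package is left alone only when its current version already satisfies the constraint. A pure sketch (the helper `needs_reinstall` is illustrative; versions are shown as plain tuples rather than StrictVersion objects):

```python
def needs_reinstall(cur, op, dest):
    """Mirror PIPFixer's operator table on (major, minor, patch) tuples."""
    if cur == dest:
        return op not in ('==', '>=', '<=')
    if cur < dest:
        return op not in ('<=', '<', '~=', '!=')
    return op not in ('>=', '>', '~=', '!=')

print(needs_reinstall((1, 2, 3), '==', (1, 2, 3)))  # False: already satisfied
print(needs_reinstall((1, 2, 3), '>=', (1, 3, 0)))  # True: installed version is too old
```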


def sanitize(data):
    # Escape angle brackets so data cannot inject HTML
    return data.replace("<", "&lt;").replace(">", "&gt;")


def sanitize_filename(input_string):
    result_string = re.sub(r'[^a-zA-Z0-9_]', '_', input_string)
    return result_string
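For reference, the two sanitizers behave as follows (a self-contained copy for illustration; the real module versions are identical in effect):

```python
import re

def sanitize(data):
    # HTML-escape angle brackets so text cannot inject tags
    return data.replace("<", "&lt;").replace(">", "&gt;")

def sanitize_filename(input_string):
    # collapse anything outside [A-Za-z0-9_] to underscores
    return re.sub(r'[^a-zA-Z0-9_]', '_', input_string)

print(sanitize("<script>"))               # &lt;script&gt;
print(sanitize_filename("my node v1.2"))  # my_node_v1_2
```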


def robust_readlines(fullpath):
    import chardet
    try:
        with open(fullpath, "r") as f:
            return f.readlines()
    except Exception:
        encoding = None
        with open(fullpath, "rb") as f:
            raw_data = f.read()
            result = chardet.detect(raw_data)
            encoding = result['encoding']

        if encoding is not None:
            with open(fullpath, "r", encoding=encoding) as f:
                return f.readlines()

        print(f"[ComfyUI-Manager] Failed to recognize encoding for: {fullpath}")
        return []


def restore_pip_snapshot(pips, options):
    non_url = []
    local_url = []
    non_local_url = []

    for k, v in pips.items():
        # NOTE: skip torch related packages
        if k.startswith("torch==") or k.startswith("torchvision==") or k.startswith("torchaudio==") or k.startswith("nvidia-"):
            continue

        if v == "":
            non_url.append(k)
        else:
            if v.startswith('file:'):
                local_url.append(v)
            else:
                non_local_url.append(v)


    # restore other pips
    failed = []
    if '--pip-non-url' in options:
        # try all at once
        res = 1
        try:
            subprocess.check_output(make_pip_cmd(['install'] + non_url))
            res = 0  # check_output raises on failure, so reaching here means success
        except Exception:
            pass

        # fallback: retry one by one
        if res != 0:
            for x in non_url:
                res = 1
                try:
                    subprocess.check_output(make_pip_cmd(['install', '--no-deps', x]))
                    res = 0
                except Exception:
                    pass

                if res != 0:
                    failed.append(x)

    if '--pip-non-local-url' in options:
        for x in non_local_url:
            res = 1
            try:
                subprocess.check_output(make_pip_cmd(['install', '--no-deps', x]))
                res = 0
            except Exception:
                pass

            if res != 0:
                failed.append(x)

    if '--pip-local-url' in options:
        for x in local_url:
            res = 1
            try:
                subprocess.check_output(make_pip_cmd(['install', '--no-deps', x]))
                res = 0
            except Exception:
                pass

            if res != 0:
                failed.append(x)

    if failed:
        print(f"Installation failed for pip packages: {failed}")
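The classification at the top of restore_pip_snapshot is pure and easy to verify in isolation. A self-contained sketch (the helper `classify_snapshot` and the sample snapshot dict are illustrative):

```python
def classify_snapshot(pips):
    """Split a {requirement: url} snapshot the way restore_pip_snapshot does."""
    non_url, local_url, non_local_url = [], [], []
    for k, v in pips.items():
        if k.startswith(("torch==", "torchvision==", "torchaudio==", "nvidia-")):
            continue  # torch-family packages are restored separately
        if v == "":
            non_url.append(k)
        elif v.startswith('file:'):
            local_url.append(v)
        else:
            non_local_url.append(v)
    return non_url, local_url, non_local_url

snap = {
    "numpy==1.26.0": "",
    "torch==2.1.0": "",
    "mypkg==0.1": "file:///wheels/mypkg-0.1-py3-none-any.whl",
    "other==1.0": "https://example.com/other-1.0.tar.gz",
}
print(classify_snapshot(snap))
# (['numpy==1.26.0'], ['file:///wheels/mypkg-0.1-py3-none-any.whl'], ['https://example.com/other-1.0.tar.gz'])
```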

custom_nodes/comfyui-manager/glob/node_package.py (new file, 72 lines)
@@ -0,0 +1,72 @@
from __future__ import annotations

from dataclasses import dataclass
import os

from git_utils import get_commit_hash


@dataclass
class InstalledNodePackage:
    """Information about an installed node package."""

    id: str
    fullpath: str
    disabled: bool
    version: str

    @property
    def is_unknown(self) -> bool:
        return self.version == "unknown"

    @property
    def is_nightly(self) -> bool:
        return self.version == "nightly"

    @property
    def is_from_cnr(self) -> bool:
        return not self.is_unknown and not self.is_nightly

    @property
    def is_enabled(self) -> bool:
        return not self.disabled

    @property
    def is_disabled(self) -> bool:
        return self.disabled

    def get_commit_hash(self) -> str:
        return get_commit_hash(self.fullpath)

    def isValid(self) -> bool:
        if self.is_from_cnr:
            return os.path.exists(os.path.join(self.fullpath, '.tracking'))

        return True

    @staticmethod
    def from_fullpath(fullpath: str, resolve_from_path) -> InstalledNodePackage:
        parent_folder_name = os.path.basename(os.path.dirname(fullpath))
        module_name = os.path.basename(fullpath)

        if module_name.endswith(".disabled"):
            node_id = module_name[:-len(".disabled")]
            disabled = True
        elif parent_folder_name == ".disabled":
            # Nodes under custom_nodes/.disabled/* are disabled
            node_id = module_name
            disabled = True
        else:
            node_id = module_name
            disabled = False

        info = resolve_from_path(fullpath)
        if info is None:
            version = 'unknown'
        else:
            node_id = info['id']  # robust module guessing
            version = info['ver']

        return InstalledNodePackage(
            id=node_id, fullpath=fullpath, disabled=disabled, version=version
        )
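from_fullpath recognizes two disabled conventions: a `.disabled` suffix on the package folder, or placement under a `custom_nodes/.disabled/` parent. A pure sketch of just that naming logic (the helper `disabled_state` is illustrative, not part of the module):

```python
import os

def disabled_state(fullpath):
    """Mirror from_fullpath's two disabled-package conventions."""
    parent = os.path.basename(os.path.dirname(fullpath))
    name = os.path.basename(fullpath)
    if name.endswith(".disabled"):
        return name[:-len(".disabled")], True   # suffix convention
    if parent == ".disabled":
        return name, True                       # folder convention
    return name, False

print(disabled_state("custom_nodes/my-node.disabled"))   # ('my-node', True)
print(disabled_state("custom_nodes/.disabled/my-node"))  # ('my-node', True)
print(disabled_state("custom_nodes/my-node"))            # ('my-node', False)
```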

custom_nodes/comfyui-manager/glob/security_check.py (new file, 124 lines)
@@ -0,0 +1,124 @@
import sys
import subprocess
import os

import manager_util


def security_check():
    print("[START] Security scan")

    custom_nodes_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..'))
    comfyui_path = os.path.abspath(os.path.join(custom_nodes_path, '..'))

    guide = {
        "ComfyUI_LLMVISION": """
0. Remove ComfyUI\\custom_nodes\\ComfyUI_LLMVISION.
1. Remove pip packages: openai-1.16.3.dist-info, anthropic-0.21.4.dist-info, openai-1.30.2.dist-info, anthropic-0.21.5.dist-info, anthropic-0.26.1.dist-info, %LocalAppData%\\rundll64.exe
   (For portable versions, it is recommended to reinstall. If you are using a venv, it is advised to recreate the venv.)
2. Remove these files from your system: lib/browser/admin.py, Cadmino.py, Fadmino.py, VISION-D.exe, BeamNG.UI.exe
3. Check your Windows registry for the key listed below and remove it.
   (HKEY_CURRENT_USER\\Software\\OpenAICLI)
4. Run a malware scanner.
5. Change all of your passwords, everywhere.

(Reinstalling the OS is recommended.)

Detailed information: https://old.reddit.com/r/comfyui/comments/1dbls5n/psa_if_youve_used_the_comfyui_llmvision_node_from/
""",
        "lolMiner": """
1. Remove pip packages: lolMiner*
2. Remove files: lolMiner*, 4G_Ethash_Linux_Readme.txt, mine* in the ComfyUI dir.

(Reinstalling ComfyUI is recommended.)
""",
        "ultralytics==8.3.41": f"""
Execute the following commands:
{sys.executable} -m pip uninstall ultralytics
{sys.executable} -m pip install ultralytics==8.3.40

Then kill and remove /tmp/ultralytics_runner

Versions 8.3.41 to 8.3.42 of the ultralytics package you installed are compromised. Please uninstall that version and reinstall the latest version.
https://blog.comfy.org/comfyui-statement-on-the-ultralytics-crypto-miner-situation/
""",
        "ultralytics==8.3.42": f"""
Execute the following commands:
{sys.executable} -m pip uninstall ultralytics
{sys.executable} -m pip install ultralytics==8.3.40

Then kill and remove /tmp/ultralytics_runner

Versions 8.3.41 to 8.3.42 of the ultralytics package you installed are compromised. Please uninstall that version and reinstall the latest version.
https://blog.comfy.org/comfyui-statement-on-the-ultralytics-crypto-miner-situation/
"""
    }

    node_blacklist = {"ComfyUI_LLMVISION": "ComfyUI_LLMVISION"}

    pip_blacklist = {
        "AppleBotzz": "ComfyUI_LLMVISION",
        "ultralytics==8.3.41": "ultralytics==8.3.41"
    }

    file_blacklist = {
        "ComfyUI_LLMVISION": ["%LocalAppData%\\rundll64.exe"],
        "lolMiner": [os.path.join(comfyui_path, 'lolMiner')]
    }

    installed_pips = subprocess.check_output(manager_util.make_pip_cmd(["freeze"]), text=True)

    detected = set()
    try:
        anthropic_info = subprocess.check_output(manager_util.make_pip_cmd(["show", "anthropic"]), text=True, stderr=subprocess.DEVNULL)
        requires_lines = [x for x in anthropic_info.split('\n') if x.startswith("Requires")]
        if requires_lines:
            anthropic_reqs = requires_lines[0].split(": ", 1)[1]
            if "pycrypto" in anthropic_reqs:
                location_lines = [x for x in anthropic_info.split('\n') if x.startswith("Location")]
                if location_lines:
                    location = location_lines[0].split(": ", 1)[1]
                    for fi in os.listdir(location):
                        if fi.startswith("anthropic"):
                            guide["ComfyUI_LLMVISION"] = f"\n0. Remove {os.path.join(location, fi)}" + guide["ComfyUI_LLMVISION"]
                    detected.add("ComfyUI_LLMVISION")
    except subprocess.CalledProcessError:
        pass

    for k, v in node_blacklist.items():
        if os.path.exists(os.path.join(custom_nodes_path, k)):
            print(f"[SECURITY ALERT] custom node '{k}' is dangerous.")
            detected.add(v)

    for k, v in pip_blacklist.items():
        if k in installed_pips:
            detected.add(v)
            break

    for k, v in file_blacklist.items():
        for x in v:
            if os.path.exists(os.path.expandvars(x)):
                detected.add(k)
                break

    if len(detected) > 0:
        for line in installed_pips.split('\n'):
            for k, v in pip_blacklist.items():
                if k in line:
                    print(f"[SECURITY ALERT] '{line}' is dangerous.")

        print("\n########################################################################")
        print(" Malware has been detected, forcibly terminating ComfyUI execution.")
        print("########################################################################\n")

        for x in detected:
            print(f"\n======== TARGET: {x} =========")
            print("\nTODO:")
            print(guide.get(x))

        exit(-1)

    print("[DONE] Security scan")
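The pip portion of the scan above is a plain substring match over `pip freeze` output that stops at the first hit. A minimal sketch, using made-up freeze text in place of the real `subprocess.check_output(manager_util.make_pip_cmd(["freeze"]), text=True)` call:

```python
# Made-up `pip freeze` output for illustration only.
installed_pips = "numpy==1.26.4\nultralytics==8.3.41\nrequests==2.31.0\n"

pip_blacklist = {
    "AppleBotzz": "ComfyUI_LLMVISION",
    "ultralytics==8.3.41": "ultralytics==8.3.41",
}

detected = set()
for needle, target in pip_blacklist.items():
    if needle in installed_pips:  # plain substring match, as in security_check()
        detected.add(target)
        break                     # the scan stops at the first hit

print(detected)  # {'ultralytics==8.3.41'}
```

Because it is a substring match, the "AppleBotzz" entry catches any package line mentioning that author string, not just an exact package name.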
433
custom_nodes/comfyui-manager/glob/share_3rdparty.py
Normal file
@@ -0,0 +1,433 @@
import mimetypes
import os
import json
import hashlib

import aiohttp
from aiohttp import web

import manager_core as core
import folder_paths
from server import PromptServer


def extract_model_file_names(json_data):
    """Extract unique model file names from the input JSON data."""
    file_names = set()
    model_filename_extensions = {'.safetensors', '.ckpt', '.pt', '.pth', '.bin'}

    # Recursively search for file names in the JSON data
    def recursive_search(data):
        if isinstance(data, dict):
            for value in data.values():
                recursive_search(value)
        elif isinstance(data, list):
            for item in data:
                recursive_search(item)
        elif isinstance(data, str) and '.' in data:
            file_names.add(os.path.basename(data))

    recursive_search(json_data)
    return [f for f in file_names if os.path.splitext(f)[1] in model_filename_extensions]
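Applied to a small (made-up) workflow fragment, the extraction keeps only basenames that carry a model extension. The function is copied here in condensed form so the snippet runs standalone:

```python
import os

MODEL_EXTS = {'.safetensors', '.ckpt', '.pt', '.pth', '.bin'}


def extract_model_file_names(json_data):
    """Condensed copy of the recursive walk above."""
    names = set()

    def walk(d):
        if isinstance(d, dict):
            for v in d.values():
                walk(v)
        elif isinstance(d, list):
            for item in d:
                walk(item)
        elif isinstance(d, str) and '.' in d:
            names.add(os.path.basename(d))

    walk(json_data)
    return [n for n in names if os.path.splitext(n)[1] in MODEL_EXTS]


# Hypothetical workflow fragment, not a real ComfyUI export.
workflow = {"nodes": [
    {"inputs": {"ckpt_name": "checkpoints/sd_xl_base_1.0.safetensors"}},
    {"inputs": {"text": "a photo of a cat"}},        # no extension -> ignored
    {"inputs": {"lora_name": "loras/detail.pt"}},
]}
print(sorted(extract_model_file_names(workflow)))  # ['detail.pt', 'sd_xl_base_1.0.safetensors']
```

Taking `os.path.basename` before filtering means paths embedded in node inputs reduce to bare filenames, which is what the later disk search expects.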
def find_file_paths(base_dir, file_names):
    """Find the paths of the given files under the base directory."""
    file_paths = {}

    for root, dirs, files in os.walk(base_dir):
        # Exclude certain directories
        dirs[:] = [d for d in dirs if d not in ['.git']]

        for file in files:
            if file in file_names:
                file_paths[file] = os.path.join(root, file)
    return file_paths


def compute_sha256_checksum(filepath):
    """Compute the SHA256 checksum of a file, in chunks."""
    sha256 = hashlib.sha256()
    with open(filepath, 'rb') as f:
        for chunk in iter(lambda: f.read(4096), b''):
            sha256.update(chunk)
    return sha256.hexdigest()
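The chunked hashing above avoids loading large model files into memory at once, and its digest matches a one-shot `hashlib.sha256` over the same bytes. The function is copied here so the check runs standalone:

```python
import hashlib
import os
import tempfile


def compute_sha256_checksum(filepath):
    """Chunked SHA-256, copied from above."""
    sha256 = hashlib.sha256()
    with open(filepath, 'rb') as f:
        for chunk in iter(lambda: f.read(4096), b''):
            sha256.update(chunk)
    return sha256.hexdigest()


payload = b"hello world" * 10000  # larger than one 4096-byte chunk
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(payload)
    path = tmp.name
try:
    # Chunked and one-shot digests agree.
    print(compute_sha256_checksum(path) == hashlib.sha256(payload).hexdigest())  # True
finally:
    os.remove(path)
```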
@PromptServer.instance.routes.get("/manager/share_option")
async def share_option(request):
    if "value" in request.rel_url.query:
        core.get_config()['share_option'] = request.rel_url.query['value']
        core.write_config()
    else:
        return web.Response(text=core.get_config()['share_option'], status=200)

    return web.Response(status=200)


def get_openart_auth():
    if not os.path.exists(os.path.join(core.manager_files_path, ".openart_key")):
        return None
    try:
        with open(os.path.join(core.manager_files_path, ".openart_key"), "r") as f:
            openart_key = f.read().strip()
            return openart_key if openart_key else None
    except Exception:
        return None


def get_matrix_auth():
    if not os.path.exists(os.path.join(core.manager_files_path, "matrix_auth")):
        return None
    try:
        with open(os.path.join(core.manager_files_path, "matrix_auth"), "r") as f:
            matrix_auth = f.read()
            homeserver, username, password = matrix_auth.strip().split("\n")
            if not homeserver or not username or not password:
                return None
            return {
                "homeserver": homeserver,
                "username": username,
                "password": password,
            }
    except Exception:
        return None


def get_comfyworkflows_auth():
    if not os.path.exists(os.path.join(core.manager_files_path, "comfyworkflows_sharekey")):
        return None
    try:
        with open(os.path.join(core.manager_files_path, "comfyworkflows_sharekey"), "r") as f:
            share_key = f.read()
            if not share_key.strip():
                return None
            return share_key
    except Exception:
        return None


def get_youml_settings():
    if not os.path.exists(os.path.join(core.manager_files_path, ".youml")):
        return None
    try:
        with open(os.path.join(core.manager_files_path, ".youml"), "r") as f:
            youml_settings = f.read().strip()
            return youml_settings if youml_settings else None
    except Exception:
        return None


def set_youml_settings(settings):
    with open(os.path.join(core.manager_files_path, ".youml"), "w") as f:
        f.write(settings)
@PromptServer.instance.routes.get("/manager/get_openart_auth")
async def api_get_openart_auth(request):
    openart_key = get_openart_auth()
    if not openart_key:
        return web.Response(status=404)
    return web.json_response({"openart_key": openart_key})


@PromptServer.instance.routes.post("/manager/set_openart_auth")
async def api_set_openart_auth(request):
    json_data = await request.json()
    openart_key = json_data['openart_key']
    with open(os.path.join(core.manager_files_path, ".openart_key"), "w") as f:
        f.write(openart_key)
    return web.Response(status=200)


@PromptServer.instance.routes.get("/manager/get_matrix_auth")
async def api_get_matrix_auth(request):
    matrix_auth = get_matrix_auth()
    if not matrix_auth:
        return web.Response(status=404)
    return web.json_response(matrix_auth)


@PromptServer.instance.routes.get("/manager/youml/settings")
async def api_get_youml_settings(request):
    youml_settings = get_youml_settings()
    if not youml_settings:
        return web.Response(status=404)
    return web.json_response(json.loads(youml_settings))


@PromptServer.instance.routes.post("/manager/youml/settings")
async def api_set_youml_settings(request):
    json_data = await request.json()
    set_youml_settings(json.dumps(json_data))
    return web.Response(status=200)


@PromptServer.instance.routes.get("/manager/get_comfyworkflows_auth")
async def api_get_comfyworkflows_auth(request):
    # Return the stored comfyworkflows.com share key, if the user has provided one.
    comfyworkflows_auth = get_comfyworkflows_auth()
    if not comfyworkflows_auth:
        return web.Response(status=404)
    return web.json_response({"comfyworkflows_sharekey": comfyworkflows_auth})


@PromptServer.instance.routes.post("/manager/set_esheep_workflow_and_images")
async def set_esheep_workflow_and_images(request):
    json_data = await request.json()
    with open(os.path.join(core.manager_files_path, "esheep_share_message.json"), "w", encoding='utf-8') as file:
        json.dump(json_data, file, indent=4)
    return web.Response(status=200)


@PromptServer.instance.routes.get("/manager/get_esheep_workflow_and_images")
async def get_esheep_workflow_and_images(request):
    with open(os.path.join(core.manager_files_path, "esheep_share_message.json"), 'r', encoding='utf-8') as file:
        data = json.load(file)
        return web.Response(status=200, text=json.dumps(data))


def set_matrix_auth(json_data):
    homeserver = json_data['homeserver']
    username = json_data['username']
    password = json_data['password']
    with open(os.path.join(core.manager_files_path, "matrix_auth"), "w") as f:
        f.write("\n".join([homeserver, username, password]))


def set_comfyworkflows_auth(comfyworkflows_sharekey):
    with open(os.path.join(core.manager_files_path, "comfyworkflows_sharekey"), "w") as f:
        f.write(comfyworkflows_sharekey)


def has_provided_matrix_auth(matrix_auth):
    return matrix_auth['homeserver'].strip() and matrix_auth['username'].strip() and matrix_auth['password'].strip()


def has_provided_comfyworkflows_auth(comfyworkflows_sharekey):
    return comfyworkflows_sharekey.strip()
@PromptServer.instance.routes.post("/manager/share")
async def share_art(request):
    json_data = await request.json()

    matrix_auth = json_data['matrix_auth']
    comfyworkflows_sharekey = json_data['cw_auth']['cw_sharekey']

    set_matrix_auth(matrix_auth)
    set_comfyworkflows_auth(comfyworkflows_sharekey)

    share_destinations = json_data['share_destinations']
    credits = json_data['credits']
    title = json_data['title']
    description = json_data['description']
    is_nsfw = json_data['is_nsfw']
    prompt = json_data['prompt']
    potential_outputs = json_data['potential_outputs']
    selected_output_index = json_data['selected_output_index']

    try:
        output_to_share = potential_outputs[int(selected_output_index)]
    except (ValueError, IndexError, TypeError):
        # for now, fall back to the first output
        output_to_share = potential_outputs[0]

    assert output_to_share['type'] in ('image', 'output')
    output_dir = folder_paths.get_output_directory()

    if output_to_share['type'] == 'image':
        asset_filename = output_to_share['image']['filename']
        asset_subfolder = output_to_share['image']['subfolder']

        if output_to_share['image']['type'] == 'temp':
            output_dir = folder_paths.get_temp_directory()
    else:
        asset_filename = output_to_share['output']['filename']
        asset_subfolder = output_to_share['output']['subfolder']

    if asset_subfolder:
        asset_filepath = os.path.join(output_dir, asset_subfolder, asset_filename)
    else:
        asset_filepath = os.path.join(output_dir, asset_filename)

    # get the mime type of the asset
    assetFileType = mimetypes.guess_type(asset_filepath)[0]

    share_website_host = "UNKNOWN"
    if "comfyworkflows" in share_destinations:
        share_website_host = "https://comfyworkflows.com"
        share_endpoint = f"{share_website_host}/api"

        # get presigned urls
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
            async with session.post(
                f"{share_endpoint}/get_presigned_urls",
                json={
                    "assetFileName": asset_filename,
                    "assetFileType": assetFileType,
                    "workflowJsonFileName": 'workflow.json',
                    "workflowJsonFileType": 'application/json',
                },
            ) as resp:
                assert resp.status == 200
                presigned_urls_json = await resp.json()
                assetFilePresignedUrl = presigned_urls_json["assetFilePresignedUrl"]
                assetFileKey = presigned_urls_json["assetFileKey"]
                workflowJsonFilePresignedUrl = presigned_urls_json["workflowJsonFilePresignedUrl"]
                workflowJsonFileKey = presigned_urls_json["workflowJsonFileKey"]

        # upload asset
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
            async with session.put(assetFilePresignedUrl, data=open(asset_filepath, "rb")) as resp:
                assert resp.status == 200

        # upload workflow json
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
            async with session.put(workflowJsonFilePresignedUrl, data=json.dumps(prompt['workflow']).encode('utf-8')) as resp:
                assert resp.status == 200

        model_filenames = extract_model_file_names(prompt['workflow'])
        model_file_paths = find_file_paths(folder_paths.base_path, model_filenames)

        models_info = {}
        for filename, filepath in model_file_paths.items():
            models_info[filename] = {
                "filename": filename,
                "sha256_checksum": compute_sha256_checksum(filepath),
                "relative_path": os.path.relpath(filepath, folder_paths.base_path),
            }

        # make a POST request to /api/upload_workflow with form data key values
        async with aiohttp.ClientSession(trust_env=True, connector=aiohttp.TCPConnector(verify_ssl=False)) as session:
            form = aiohttp.FormData()
            if comfyworkflows_sharekey:
                form.add_field("shareKey", comfyworkflows_sharekey)
            form.add_field("source", "comfyui_manager")
            form.add_field("assetFileKey", assetFileKey)
            form.add_field("assetFileType", assetFileType)
            form.add_field("workflowJsonFileKey", workflowJsonFileKey)
            form.add_field("sharedWorkflowWorkflowJsonString", json.dumps(prompt['workflow']))
            form.add_field("sharedWorkflowPromptJsonString", json.dumps(prompt['output']))
            form.add_field("shareWorkflowCredits", credits)
            form.add_field("shareWorkflowTitle", title)
            form.add_field("shareWorkflowDescription", description)
            form.add_field("shareWorkflowIsNSFW", str(is_nsfw).lower())
            form.add_field("currentSnapshot", json.dumps(await core.get_current_snapshot()))
            form.add_field("modelsInfo", json.dumps(models_info))

            async with session.post(
                f"{share_endpoint}/upload_workflow",
                data=form,
            ) as resp:
                assert resp.status == 200
                upload_workflow_json = await resp.json()
                workflowId = upload_workflow_json["workflowId"]

    # check if the user has provided Matrix credentials
    if "matrix" in share_destinations:
        comfyui_share_room_id = '!LGYSoacpJPhIfBqVfb:matrix.org'
        filename = os.path.basename(asset_filepath)
        content_type = assetFileType

        try:
            from nio import AsyncClient, LoginResponse, UploadResponse

            homeserver = 'matrix.org'
            if matrix_auth:
                homeserver = matrix_auth.get('homeserver', 'matrix.org')
            homeserver = homeserver.replace("http://", "https://")
            if not homeserver.startswith("https://"):
                homeserver = "https://" + homeserver

            client = AsyncClient(homeserver, matrix_auth['username'])

            # Login
            login_resp = await client.login(matrix_auth['password'])
            if not isinstance(login_resp, LoginResponse) or not login_resp.access_token:
                await client.close()
                return web.json_response({"error": "Invalid Matrix credentials."}, content_type='application/json', status=400)

            # Upload asset
            with open(asset_filepath, 'rb') as f:
                upload_resp, _maybe_keys = await client.upload(f, content_type=content_type, filename=filename)
                f.seek(0)
                asset_data = f.read()  # re-read so the message info below can report the size
            if not isinstance(upload_resp, UploadResponse) or not upload_resp.content_uri:
                await client.close()
                return web.json_response({"error": "Failed to upload asset to Matrix."}, content_type='application/json', status=500)
            mxc_url = upload_resp.content_uri

            # Upload workflow JSON
            import io
            workflow_json_bytes = json.dumps(prompt['workflow']).encode('utf-8')
            workflow_io = io.BytesIO(workflow_json_bytes)
            upload_workflow_resp, _maybe_keys = await client.upload(workflow_io, content_type='application/json', filename='workflow.json')
            if not isinstance(upload_workflow_resp, UploadResponse) or not upload_workflow_resp.content_uri:
                await client.close()
                return web.json_response({"error": "Failed to upload workflow to Matrix."}, content_type='application/json', status=500)
            workflow_json_mxc_url = upload_workflow_resp.content_uri

            # Send text message
            text_content = ""
            if title:
                text_content += f"{title}\n"
            if description:
                text_content += f"{description}\n"
            if credits:
                text_content += f"\ncredits: {credits}\n"
            await client.room_send(
                room_id=comfyui_share_room_id,
                message_type="m.room.message",
                content={"msgtype": "m.text", "body": text_content}
            )

            # Send image
            await client.room_send(
                room_id=comfyui_share_room_id,
                message_type="m.room.message",
                content={
                    "msgtype": "m.image",
                    "body": filename,
                    "url": mxc_url,
                    "info": {
                        "mimetype": content_type,
                        "size": len(asset_data)
                    }
                }
            )

            # Send workflow JSON file
            await client.room_send(
                room_id=comfyui_share_room_id,
                message_type="m.room.message",
                content={
                    "msgtype": "m.file",
                    "body": "workflow.json",
                    "url": workflow_json_mxc_url,
                    "info": {
                        "mimetype": "application/json",
                        "size": len(workflow_json_bytes)
                    }
                }
            )

            await client.close()

        except Exception:
            import traceback
            traceback.print_exc()
            return web.json_response({"error": "An error occurred when sharing your art to Matrix."}, content_type='application/json', status=500)

    return web.json_response({
        "comfyworkflows": {
            "url": None if "comfyworkflows" not in share_destinations else f"{share_website_host}/workflows/{workflowId}",
        },
        "matrix": {
            "success": None if "matrix" not in share_destinations else True
        }
    }, content_type='application/json', status=200)