
Integrate Langfuse



What is Langfuse

Langfuse is an open-source LLM engineering platform that helps teams collaborate on debugging, analyzing, and iterating on their LLM applications.

How to Configure Langfuse

  1. Register and log in to Langfuse on the official website (https://langfuse.com/).

  2. Create a project in Langfuse. After logging in, click New on the homepage to create your own project. The project will be associated with applications in Dify for data monitoring.

Enter a name for the project.

  3. Create project API credentials. In the left sidebar of the project, click Settings to open the settings page.

In Settings, click Create API Keys to create project API credentials.

Copy and save the Secret Key, Public Key, and Host.

  4. Configure Langfuse in Dify. Open the application you need to monitor, open Monitoring in the side menu, and select Tracing app performance on the page.

After clicking Configure, paste the Secret Key, Public Key, and Host created in Langfuse into the configuration and save.

Once saved successfully, you can view the monitoring status on the current page. If it shows as Started, the application is being monitored.
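If you want to verify the credentials before pasting them into Dify, a quick check with the Langfuse Python SDK is one option. This is a minimal sketch, assuming the langfuse package is installed and that the placeholder keys and host are replaced with the values created in step 3.

```python
# pip install langfuse
from langfuse import Langfuse

# Placeholders: use the Secret Key, Public Key, and Host created in step 3.
langfuse = Langfuse(
    public_key="pk-lf-...",
    secret_key="sk-lf-...",
    host="https://cloud.langfuse.com",  # or the URL of your self-hosted Langfuse
)

# auth_check() returns True when the key pair and host are valid.
print("Credentials valid:", langfuse.auth_check())
```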


Viewing Monitoring Data in Langfuse

After configuration, the debugging and production data of the application in Dify can be viewed in Langfuse.
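Any interaction with the application produces trace data: debugging in the Dify studio, conversations in the published web app, and calls made through the Service API. As a hedged illustration, the sketch below sends a single chat message to a chat-type application through Dify's POST /v1/chat-messages endpoint; the API base URL, app API key, and query are placeholders to replace with your own values.

```python
import requests

DIFY_API_BASE = "https://api.dify.ai/v1"  # or the API base URL of your self-hosted Dify
DIFY_APP_KEY = "app-..."                  # Service API key of the monitored application

# Send one chat message; the resulting run is traced by Dify and forwarded to Langfuse.
response = requests.post(
    f"{DIFY_API_BASE}/chat-messages",
    headers={"Authorization": f"Bearer {DIFY_APP_KEY}"},
    json={
        "inputs": {},
        "query": "What can you do?",
        "response_mode": "blocking",
        "user": "langfuse-demo-user",
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["answer"])
```

Shortly after the call completes, the corresponding trace should appear in the Traces view of the Langfuse project.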


List of monitoring data

Workflow/Chatflow Trace Information

Used to track workflow and chatflow

Workflow → LangFuse Trace

  • workflow_app_log_id/workflow_run_id → id

  • user_session_id → user_id

  • workflow_{id} → name

  • start_time → start_time

  • end_time → end_time

  • inputs → input

  • outputs → output

  • Model token consumption → usage

  • metadata → metadata

  • error → level

  • error → status_message

  • [workflow] → tags

  • ["message", conversation_mode] → session_id

  • conversion_id → parent_observation_id

Workflow Trace Info

  • workflow_id - Unique ID of Workflow

  • conversation_id - Conversation ID

  • workflow_run_id - Workflow ID of this runtime

  • tenant_id - Tenant ID

  • elapsed_time - Elapsed time at this runtime

  • status - Runtime status

  • version - Workflow version

  • total_tokens - Total tokens used at this runtime

  • file_list - List of files processed

  • triggered_from - Source that triggered this runtime

  • workflow_run_inputs - Input of this workflow

  • workflow_run_outputs - Output of this workflow

  • error - Error Message

  • query - Queries used at runtime

  • workflow_app_log_id - Workflow Application Log ID

  • message_id - Relevant Message ID

  • start_time - Start time of this runtime

  • end_time - End time of this runtime

  • workflow node executions - Workflow node runtime information

  • Metadata

    • workflow_id - Unique ID of Workflow

    • conversation_id - Conversation ID

    • workflow_run_id - Workflow ID of this runtime

    • tenant_id - Tenant ID

    • elapsed_time - Elapsed time at this runtime

    • status - Operational state

    • version - Workflow version

    • total_tokens - Total tokens used at this runtime

    • file_list - List of files processed

    • triggered_from - Source that triggered this runtime
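Because workflow runs arrive in Langfuse as traces tagged workflow and carry the metadata fields listed above, they can also be read back programmatically. The sketch below lists recent workflow traces through Langfuse's public REST API, which uses HTTP Basic auth with the Public Key as username and the Secret Key as password; treat the exact query parameters as an assumption to check against the Langfuse API reference, and replace the placeholder keys and host with your own.

```python
import requests

LANGFUSE_HOST = "https://cloud.langfuse.com"  # or the URL of your self-hosted Langfuse
PUBLIC_KEY = "pk-lf-..."                      # placeholder project credentials
SECRET_KEY = "sk-lf-..."

# List the most recent traces tagged "workflow" (the tag Dify assigns to workflow runs).
resp = requests.get(
    f"{LANGFUSE_HOST}/api/public/traces",
    auth=(PUBLIC_KEY, SECRET_KEY),
    params={"tags": "workflow", "limit": 10},
    timeout=30,
)
resp.raise_for_status()

for trace in resp.json()["data"]:
    # name is workflow_{id}; metadata carries the run details listed above.
    metadata = trace.get("metadata") or {}
    print(trace["name"], metadata.get("workflow_run_id"), metadata.get("status"))
```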

Message Trace Information

Used to track LLM conversations

Message → LangFuse Generation/Trace

  • message_id → id

  • user_session_id → user_id

  • message_{id} → name

  • start_time → start_time

  • end_time → end_time

  • inputs → input

  • outputs → output

  • Model token consumption → usage

  • metadata → metadata

  • error → level

  • error → status_message

  • ["message", conversation_mode] → tags

  • conversation_id → session_id

  • conversion_id → parent_observation_id

Message Trace Info

  • message_id - Message ID

  • message_data - Message data

  • user_session_id - Session ID for user

  • conversation_model - Conversation model

  • message_tokens - Message tokens

  • answer_tokens - Answer Tokens

  • total_tokens - Total Tokens from Message and Answer

  • error - Error Message

  • inputs - Input data

  • outputs - Output data

  • file_list - List of files processed

  • start_time - Start time

  • end_time - End time

  • message_file_data - File data associated with the message

  • conversation_mode - Conversation mode

  • Metadata

    • conversation_id - Conversation ID

    • ls_provider - Model provider

    • ls_model_name - Model ID

    • status - Message status

    • from_end_user_id - Sending user's ID

    • from_account_id - Sending account's ID

    • agent_based - Whether agent based

    • workflow_run_id - Workflow ID of this runtime

    • from_source - Message source

    • message_id - Message ID

Moderation Trace Information

Used to track conversation moderation

Moderation → LangFuse Generation/Trace

  • user_id → user_id

  • moderation → name

  • start_time → start_time

  • end_time → end_time

  • inputs → input

  • outputs → output

  • metadata → metadata

  • [moderation] → tags

  • message_id → parent_observation_id

Moderation Trace Info

  • message_id - Message ID

  • user_id - User ID

  • workflow_app_log_id - Workflow application log ID

  • inputs - Input data for review

  • message_data - Message Data

  • flagged - Whether it is flagged for attention

  • action - Specific actions to implement

  • preset_response - Preset response

  • start_time - Start time of review

  • end_time - End time of review

  • Metadata

    • message_id - Message ID

    • action - Specific actions to implement

    • preset_response - Preset response

Suggested Question Trace Information

Used to track suggested questions

Suggested Question → LangFuse Generation/Trace

  • user_id → user_id

  • suggested_question → name

  • start_time → start_time

  • end_time → end_time

  • inputs → input

  • outputs → output

  • metadata → metadata

  • [suggested_question] → tags

  • message_id → parent_observation_id

Suggested Question Trace Info

  • message_id - Message ID

  • message_data - Message data

  • inputs - Input data

  • outputs - Output data

  • start_time - Start time

  • end_time - End time

  • total_tokens - Total tokens

  • status - Message Status

  • error - Error Message

  • from_account_id - Sending account ID

  • agent_based - Whether agent based

  • from_source - Message source

  • model_provider - Model provider

  • model_id - Model ID

  • suggested_question - Suggested question

  • level - Status level

  • status_message - Message status

  • Metadata

    • message_id - Message ID

    • ls_provider - Model Provider

    • ls_model_name - Model ID

    • status - Message status

    • from_end_user_id - Sending user's ID

    • from_account_id - Sending Account ID

    • workflow_run_id - Workflow ID of this runtime

    • from_source - Message source

Dataset Retrieval Trace Information

Used to track knowledge base retrieval

Dataset Retrieval → LangFuse Generation/Trace

  • user_id → user_id

  • dataset_retrieval → name

  • start_time → start_time

  • end_time → end_time

  • inputs → input

  • outputs → output

  • metadata → metadata

  • [dataset_retrieval] → tags

  • message_id → parent_observation_id

Dataset Retrieval Trace Info

  • message_id - Message ID

  • inputs - Input Message

  • documents - Document data

  • start_time - Start time

  • end_time - End time

  • message_data - Message data

  • Metadata

    • message_id - Message ID

    • ls_provider - Model Provider

    • ls_model_name - Model ID

    • status - Model status

    • from_end_user_id - Sending user's ID

    • from_account_id - Sending account's ID

    • agent_based - Whether agent based

    • workflow_run_id - Workflow ID of this runtime

    • from_source - Message Source

Tool Trace Information

Used to track tool invocation

Tool → LangFuse Generation/Trace

  • user_id → user_id

  • tool_name → name

  • start_time → start_time

  • end_time → end_time

  • inputs → input

  • outputs → output

  • metadata → metadata

  • ["tool", tool_name] → tags

  • message_id → parent_observation_id

Tool Trace Info

  • message_id - Message ID

  • tool_name - Tool Name

  • start_time - Start time

  • end_time - End time

  • tool_inputs - Tool inputs

  • tool_outputs - Tool outputs

  • message_data - Message data

  • error - Error message, if one exists

  • inputs - Input of Message

  • outputs - Output of Message

  • tool_config - Tool config

  • time_cost - Time cost

  • tool_parameters - Tool Parameters

  • file_url - URL of relevant files

  • Metadata

    • message_id - Message ID

    • tool_name - Tool Name

    • tool_inputs - Tool inputs

    • tool_outputs - Tool outputs

    • tool_config - Tool config

    • time_cost - Time cost

    • error - Error Message

    • tool_parameters - Tool parameters

    • message_file_id - Message file ID

    • created_by_role - Created by role

    • created_user_id - Created user ID

Generate Name Trace Information

Used to track conversation title generation

Generate Name → LangFuse Generation/Trace

  • user_id → user_id

  • generate_name → name

  • start_time → start_time

  • end_time → end_time

  • inputs → input

  • outputs → output

  • metadata → metadata

  • [generate_name] → tags

Generate Name Trace Info

  • conversation_id - Conversation ID

  • inputs - Input data

  • outputs - Generated session name

  • start_time - Start time

  • end_time - End time

  • tenant_id - Tenant ID

  • Metadata

    • conversation_id - Conversation ID

    • tenant_id - Tenant ID
