
Integration Troubleshooting Guide

Solutions to common integration issues when using the EarnLayer SDK with the Vercel AI SDK and other frameworks.

Vercel AI SDK Integration (streamText)

If you’re using the Vercel AI SDK’s streamText function, you need a custom tool wrapper pattern, because streamText does not accept MCP tools directly.

Problem: streamText doesn’t support MCP tools natively

Solution: Create a custom tool wrapper that wraps MCP calls

Step 1: Install Required Dependencies

npm install @earnlayer/sdk @modelcontextprotocol/sdk ai zod

Step 2: Create Custom MCP Tool Wrapper

CREATE: lib/ai/tools/earnlayer-ads.ts

import { tool } from 'ai';
import { z } from 'zod';
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
 
const EARNLAYER_MCP_URL = process.env.EARNLAYER_MCP_URL!;
const EARNLAYER_API_KEY = process.env.EARNLAYER_API_KEY!;
 
// Factory function to create tool with context
export function createEarnlayerAdsTool(options: {
  conversationId?: string;
  demoMode?: boolean;
}) {
  return tool({
    description: 'IMPORTANT: Search for sponsored product recommendations. You MUST call this tool whenever a user asks about products, services, tools, software, apps, websites, or anything that could be purchased, subscribed to, or used. This tool finds relevant sponsored links to include in your response.',
    inputSchema: z.object({
      query: z.string().describe('The user\'s question or search query about products/services. Extract the main product/service topic from their question.'),
    }),
    execute: async ({ query }: { query: string }) => {
      let client: Client | null = null;
      
      try {
        const headers: Record<string, string> = {
          'x-api-key': EARNLAYER_API_KEY,
        };
        
        if (options.conversationId) {
          headers['x-conversation-id'] = options.conversationId;
        }
        
        if (options.demoMode !== undefined) {
          headers['x-demo-mode'] = options.demoMode ? 'true' : 'false';
        }
        
        const transport = new StreamableHTTPClientTransport(
          new URL(EARNLAYER_MCP_URL),
          { requestInit: { headers } }
        );
        
        client = new Client(
          { name: 'earnlayer-client', version: '1.0.0' },
          { capabilities: {} }
        );
        
        await client.connect(transport);
        
        // MCP expects queries as array and conversation_id in arguments
        const mcpArguments: Record<string, any> = {
          queries: [query], // Array format required
        };
        
        if (options.conversationId) {
          mcpArguments.conversation_id = options.conversationId;
        }
        
        const result = await client.callTool({
          name: 'earnlayer_content_ads_search',
          arguments: mcpArguments
        });
        
        await client.close();
        client = null;
        
        // Parse response - MCP returns { results: [{ query, hyperlink_ads: [...] }], summary: {...} }
        if (result.content && result.content[0]?.type === 'text') {
          const content = JSON.parse(result.content[0].text);
          const hyperlinkAds = content.results?.[0]?.hyperlink_ads || [];
          
          return {
            ads: hyperlinkAds,
            query,
            summary: content.summary
          };
        }
        
        return { ads: [], query };
      } catch (error) {
        console.error('EarnLayer MCP error:', error);
        if (client) {
          try {
            await client.close();
          } catch (e) {
            // Ignore close errors
          }
        }
        return { ads: [], query, error: 'Failed to fetch ads' };
      }
    },
  });
}

Key Points:

  • Use inputSchema (not parameters) for Vercel AI SDK tool definition
  • MCP arguments must be: { queries: [query], conversation_id?: string }
  • Response parsing: read content.results?.[0]?.hyperlink_ads (not content.ads); see the response shape sketch after this list
  • Always close the MCP client, either in a finally block or, as above, on both the success and error paths
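
For reference, here is the response shape the wrapper above assumes when parsing the tool result. The individual hyperlink ad fields are not documented in this guide, so they are left loosely typed; treat this as a sketch of what the parsing code expects, not the authoritative schema.

// Shape assumed by the parsing code above (content.results[0].hyperlink_ads).
interface EarnLayerAdsSearchContent {
  results?: Array<{
    query: string;
    hyperlink_ads: Array<Record<string, unknown>>; // ad fields (name, url, ...) are not specified here
  }>;
  summary?: Record<string, unknown>;
}
 
// What the tool's execute() returns to the model:
interface EarnLayerAdsToolResult {
  ads: Array<Record<string, unknown>>;
  query: string;
  summary?: Record<string, unknown>;
  error?: string;
}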

Step 3: Update Chat Route

MODIFY: app/(chat)/api/chat/route.ts

import { createEarnlayerAdsTool } from '@/lib/ai/tools/earnlayer-ads';
import { DEFAULT_MCP_INSTRUCTIONS } from '@earnlayer/sdk/mcp';
import { convertToModelMessages, streamText } from 'ai';
 
export async function POST(request: Request) {
  const requestBody = await request.json();
  const {
    id,
    message,
    selectedChatModel,
    conversationId, // Extract from request
  } = requestBody;
  
  // Create tool with context
  const earnlayerAdsSearch = createEarnlayerAdsTool({
    conversationId: conversationId ?? undefined,
    demoMode: process.env.NODE_ENV === 'development'
  });
  
  const result = streamText({
    model: myProvider.languageModel(selectedChatModel),
    system: systemPrompt({ selectedChatModel }) + '\n\n' + DEFAULT_MCP_INSTRUCTIONS + '\n\nIMPORTANT: When users ask about products, services, tools, software, or anything purchasable, you MUST call the earnlayerAdsSearch tool FIRST before answering. Include relevant sponsored links in markdown format [Product Name](url) in your response.',
    messages: convertToModelMessages(uiMessages), // uiMessages: your existing chat history plus the new message (app-specific)
    experimental_activeTools: [
      "getWeather",
      "earnlayerAdsSearch", // ADD: Must be in both arrays
    ],
    tools: {
      getWeather,
      earnlayerAdsSearch, // ADD: Tool instance
    },
    // ... rest of config
  });
  
  return result.toUIMessageStreamResponse(); // already returns a Response; do not wrap it again
}

Critical: The tool must be in BOTH the experimental_activeTools array AND the tools object

Schema Validation Errors

Problem: “Invalid schema” or “conversationId is required” errors

Solution: Update request schema to allow optional conversationId

MODIFY: app/(chat)/api/chat/schema.ts

import { z } from "zod";
 
export const postRequestBodySchema = z.object({
  id: z.string().uuid(),
  message: z.object({
    id: z.string().uuid(),
    role: z.enum(["user"]),
    parts: z.array(partSchema), // partSchema: your existing message-part schema
  }),
  selectedChatModel: z.enum(["chat-model", "chat-model-reasoning"]),
  selectedVisibilityType: z.enum(["public", "private"]),
  conversationId: z.string().optional().nullable(), // ADD: Optional conversationId
});

Why: conversationId may be null initially before initializeConversation() completes. Schema must allow null/undefined to prevent validation errors.

Conditional conversationId in Request Body

Problem: Sending null conversationId causes validation errors

Solution: Conditionally include conversationId only if it exists

MODIFY: components/chat.tsx (or your chat component)

import { useEarnLayerClient } from "@earnlayer/sdk/react";
import { useChat } from "@ai-sdk/react";
import { DefaultChatTransport } from "ai";
import { useEffect, useRef } from "react";
 
function ChatComponent() {
  const { conversationId, initializeConversation } = useEarnLayerClient();
  const hasInitialized = useRef(false); // guard so initialization runs only once (e.g. under React Strict Mode)
  
  // Initialize on mount
  useEffect(() => {
    if (!hasInitialized.current) {
      hasInitialized.current = true;
      initializeConversation({
        demoMode: process.env.NODE_ENV === 'development' // Note: camelCase
      });
    }
  }, [initializeConversation]);
  
  const { sendMessage } = useChat({
    transport: new DefaultChatTransport({
      api: "/api/chat",
      prepareSendMessagesRequest(request) {
        return {
          body: {
            id: request.id,
            message: request.messages.at(-1),
            selectedChatModel: currentModelId,
            selectedVisibilityType: visibilityType,
            ...(conversationId && { conversationId }), // ADD: Only include if exists
            ...request.body,
          },
        };
      },
    }),
    // ... rest of config
  });
}

Why: The conditional spread prevents sending null/undefined values that would fail schema validation.

Message Parts Extraction for Impression Confirmation

Problem: confirmHyperlinkImpressions expects string, but message object has parts array

Solution: Extract text from message.parts array

MODIFY: components/chat.tsx onFinish callback

import { useEarnLayerClient } from "@earnlayer/sdk/react";
 
function ChatComponent() {
  const { client, conversationId } = useEarnLayerClient();
  
  const { messages, sendMessage } = useChat({
    // ... config
    onFinish: async ({ message }) => {
      // Confirm hyperlink impressions after AI response
      if (conversationId && message && message.role === 'assistant') {
        // Extract text content from message parts array
        const textContent = message.parts
          ?.filter((part: any) => part.type === 'text')
          .map((part: any) => part.text)
          .join('\n') || '';
        
        if (textContent && client) {
          try {
            const result = await client.confirmHyperlinkImpressions(
              conversationId,
              textContent
            );
            console.log(`Confirmed ${result.confirmed_count} impressions`);
          } catch (error) {
            console.error('Failed to confirm impressions:', error);
          }
        }
      }
    },
  });
}

Why: Vercel AI SDK messages use a parts array structure; extract all text parts and join them.

Provider Setup Location

Problem: Where to wrap with EarnLayerProvider?

Solution: Wrap in root layout, not page component

MODIFY: app/layout.tsx (root layout)

import { EarnLayerProvider } from "@earnlayer/sdk/react";
 
export default function RootLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html lang="en">
      <body>
        <ThemeProvider>
          <SessionProvider>
            <EarnLayerProvider>
              {children}
            </EarnLayerProvider>
          </SessionProvider>
        </ThemeProvider>
      </body>
    </html>
  );
}

Why: The root layout makes the provider available to all pages and components; wrapping it in a page component limits availability to that page.

demoMode Parameter Naming

Problem: Inconsistent naming between demoMode and demo_mode

Solution: Use camelCase (demoMode) consistently

Correct Usage:

// Tool creation
const earnlayerAdsSearch = createEarnlayerAdsTool({
  conversationId: conversationId ?? undefined,
  demoMode: process.env.NODE_ENV === 'development' // camelCase
});
 
// Conversation initialization
initializeConversation({
  demoMode: process.env.NODE_ENV === 'development' // camelCase
});

Note: The SDK uses camelCase (demoMode) in TypeScript/JavaScript. The corresponding MCP header is the lowercase, hyphenated x-demo-mode, but that mapping is handled internally.

System Prompt Pattern

Problem: Tool not being called or ads not appearing

Solution: Use DEFAULT_MCP_INSTRUCTIONS plus explicit instructions

MODIFY: app/(chat)/api/chat/route.ts

import { DEFAULT_MCP_INSTRUCTIONS } from '@earnlayer/sdk/mcp';
 
const result = streamText({
  model: myProvider.languageModel(selectedChatModel),
  system: systemPrompt({ selectedChatModel }) + '\n\n' + DEFAULT_MCP_INSTRUCTIONS + '\n\nIMPORTANT: When users ask about products, services, tools, software, or anything purchasable, you MUST call the earnlayerAdsSearch tool FIRST before answering. Include relevant sponsored links in markdown format [Product Name](url) in your response.',
  // ... rest
});

Why: DEFAULT_MCP_INSTRUCTIONS provides the base instructions; the additional explicit instructions ensure the tool is called for product-related queries.

experimental_activeTools Configuration

Problem: Tool defined but LLM never calls it

Solution: Add tool to BOTH experimental_activeTools and tools

const result = streamText({
  // ... config
  experimental_activeTools: [
    "getWeather",
    "earnlayerAdsSearch", // REQUIRED: Tool name as string
  ],
  tools: {
    getWeather,
    earnlayerAdsSearch, // REQUIRED: Tool instance
  },
});

Why: experimental_activeTools tells the LLM which tools are available, and the tools object provides the actual tool implementations. Both are required.

Artifact View Integration

Problem: Thinking ads not showing in artifact/preview view

Solution: Add ThinkingAdComponent to artifact messages component

MODIFY: components/artifact-messages.tsx

import { ThinkingAdComponent } from "./thinking-ad";
 
function ArtifactMessages({ status, messages }: Props) {
  return (
    <div className="chat-scroll-container">
      {messages.map((message) => (
        <PreviewMessage key={message.id} message={message} />
      ))}
      {/* ADD: Thinking ad for artifact view */}
      {status === "submitted" && (
        <ThinkingAdComponent status={status} />
      )}
    </div>
  );
}

Why: The artifact view renders messages separately, so the thinking ad must be added explicitly.

Quick Reference Checklist

Before deploying, verify:

Setup

  • EarnLayerProvider wraps app in app/layout.tsx
  • Proxy endpoint created at app/api/earnlayer/[...slug]/route.ts (see the sketch after this list)
  • Environment variables set: EARNLAYER_API_KEY, EARNLAYER_MCP_URL (see the .env.local example after this list)
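
A minimal .env.local for the variables referenced above (values are placeholders; use the API key and MCP URL provided by EarnLayer):

EARNLAYER_API_KEY=your-earnlayer-api-key
EARNLAYER_MCP_URL=https://your-earnlayer-mcp-endpoint

The proxy endpoint itself is not shown in this guide. Below is a rough sketch of app/api/earnlayer/[...slug]/route.ts, assuming the SDK’s browser client calls this route and that the REST base URL lives in a hypothetical EARNLAYER_API_URL variable; adjust it to whatever endpoints the EarnLayer React client actually calls.

// Illustrative sketch only: EARNLAYER_API_URL and the exact paths/methods the
// EarnLayer React client uses are assumptions, not documented in this guide.
const EARNLAYER_API_URL = process.env.EARNLAYER_API_URL!; // hypothetical upstream base URL
const EARNLAYER_API_KEY = process.env.EARNLAYER_API_KEY!;
 
async function proxy(
  request: Request,
  { params }: { params: { slug: string[] } } // in recent Next.js versions params may need to be awaited
) {
  const search = new URL(request.url).search;
  const upstreamUrl = `${EARNLAYER_API_URL}/${params.slug.join('/')}${search}`;
 
  const upstream = await fetch(upstreamUrl, {
    method: request.method,
    headers: {
      'content-type': request.headers.get('content-type') ?? 'application/json',
      'x-api-key': EARNLAYER_API_KEY, // the key stays server-side; never expose it to the browser
    },
    body: ['GET', 'HEAD'].includes(request.method) ? undefined : await request.text(),
  });
 
  return new Response(upstream.body, {
    status: upstream.status,
    headers: { 'content-type': upstream.headers.get('content-type') ?? 'application/json' },
  });
}
 
export { proxy as GET, proxy as POST, proxy as DELETE };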

Chat Route Integration

  • Custom tool wrapper created: lib/ai/tools/earnlayer-ads.ts
  • Tool uses inputSchema (not parameters)
  • Tool added to experimental_activeTools array
  • Tool added to tools object
  • System prompt includes DEFAULT_MCP_INSTRUCTIONS
  • conversationId extracted from request body
  • Tool created with conversationId and demoMode context

Schema

  • conversationId: z.string().optional().nullable() in schema

Client Component

  • initializeConversation() called on mount with demoMode
  • conversationId conditionally included in request body
  • onFinish callback extracts text from message.parts
  • confirmHyperlinkImpressions called with extracted text

Display Ads

  • ThinkingAdComponent uses manual fetch (autoFetch: false)
  • DisplayAdComponent uses assistantMessageCount dependency
  • Both components handle status conditions correctly

Common Integration Patterns

Pattern 1: Vercel AI SDK with streamText

// Tool wrapper
const tool = createEarnlayerAdsTool({ conversationId, demoMode });
 
// In streamText
experimental_activeTools: ["earnlayerAdsSearch"],
tools: { earnlayerAdsSearch: tool },

Pattern 2: Message Parts Extraction

const textContent = message.parts
  ?.filter(part => part.type === 'text')
  .map(part => part.text)
  .join('\n') || '';

Pattern 3: Conditional conversationId

// Request body
...(conversationId && { conversationId })
 
// Tool creation
conversationId: conversationId ?? undefined

Pattern 4: Demo Mode Configuration

// Tool creation
demoMode: process.env.NODE_ENV === 'development'
 
// Conversation initialization
demoMode: process.env.NODE_ENV === 'development'

Debugging Tips

1. Enable Debug Logging

// In tool wrapper
console.log('[EarnLayer] Tool called with query:', query);
console.log('[EarnLayer] Conversation ID:', options.conversationId);
console.log('[EarnLayer] MCP arguments:', mcpArguments);
console.log('[EarnLayer] Response:', content);

2. Check Tool Registration

// Verify tool is in both places
console.log('Active tools:', experimental_activeTools);
console.log('Tools object keys:', Object.keys(tools));

3. Verify conversationId Flow

// In chat component
console.log('Conversation ID:', conversationId);
 
// In chat route
console.log('Received conversationId:', conversationId);

4. Test MCP Connection

// In tool wrapper, add connection test
try {
  await client.connect(transport);
  console.log('[EarnLayer] MCP connection successful');
} catch (error) {
  console.error('[EarnLayer] MCP connection failed:', error);
}

5. Check Response Parsing

// Verify response structure
console.log('[EarnLayer] Raw response:', result.content);
console.log('[EarnLayer] Parsed content:', content);
console.log('[EarnLayer] Hyperlink ads:', hyperlinkAds);

Next Steps