TL;DR: The optimal solution utilizes Depth-First Search (DFS) combined with a Hash Map to create a deep copy of each node exactly once, using the map to handle cycles and prevent infinite recursion.
The Clone Graph problem asks us to create a deep copy of a connected undirected graph. We are given a reference to a single node in the graph. A "deep copy" means we must create entirely new instances for every node and reconstruct the exact same edge structure (neighbor connections) as the original graph. The new graph must not contain any references to nodes from the original graph.
This is a classic graph problem, often referred to as LeetCode 133, and serves as a fundamental test of graph traversal and object referencing skills.
A naive approach might traverse the graph with plain recursion: create a new copy of the current node, then immediately recurse into each of its neighbors and attach the results. Pseudo-code for this naive approach is sketched below.
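A minimal sketch, assuming the same `Node` class used in the full solutions below (the helper name `naive_clone` is made up for illustration):

```python
# Naive (incorrect) recursive clone: keeps no record of nodes already copied.
def naive_clone(node):
    if node is None:
        return None
    clone = Node(node.val)
    # Recurse into every neighbor. On the undirected edge A -- B this becomes
    # naive_clone(A) -> naive_clone(B) -> naive_clone(A) -> ... with no way to stop.
    for neighbor in node.neighbors:
        clone.neighbors.append(naive_clone(neighbor))
    return clone
```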
This approach fails critically because undirected graphs often contain cycles (e.g., A connects to B, and B connects back to A).
If clone(A) calls clone(B), then clone(B) sees A as a neighbor and calls clone(A) again. This creates infinite recursion that eventually overflows the call stack (reported on LeetCode as a Runtime Error or Time Limit Exceeded).

The core difficulty in Clone Graph is handling cycles and shared neighbors. The graph is undirected, meaning that if node A is a neighbor of B, then B is a neighbor of A.
To solve this efficiently, we need a mechanism to "remember" which nodes have already been cloned. This leads to the primary insight: Use a Hash Map.
The Hash Map serves two purposes:

1. It maps each Original Node (key) to its Cloned Node (value).
2. It acts as the visited set: if we encounter a node that is already in the map, we simply return the reference to the existing clone rather than creating a new one.

Visualizing the Process:
Imagine the algorithm as a web crawler. When the recursion visits a node (e.g., Node 1), it first checks its notebook (the Hash Map). If Node 1 is not in the notebook, it creates "Clone 1", writes down 1 -> Clone 1, and then proceeds to visit Node 1's neighbors. When it eventually loops back to Node 1 via a neighbor, it checks the notebook, sees 1 -> Clone 1 exists, and simply links the edge to "Clone 1" instead of restarting the work.

Let's trace the algorithm on a simple graph: 1 -- 2.
- Call cloneGraph(Node 1):
  - Node 1 is not in the map, so create Clone 1.
  - Map is now {Node 1: Clone 1}.
  - Node 1's neighbors: [Node 2].
  - Recursive call cloneGraph(Node 2):
    - Node 2 is not in the map, so create Clone 2.
    - Map is now {Node 1: Clone 1, Node 2: Clone 2}.
    - Node 2's neighbors: [Node 1].
    - Recursive call cloneGraph(Node 1) (from inside Node 2's context):
      - Node 1 is already in the map, so return the existing Clone 1.
    - Back in cloneGraph(Node 2): append Clone 1 to Clone 2.neighbors and return Clone 2.
  - Back in cloneGraph(Node 1): append Clone 2 to Clone 1.neighbors and return Clone 1.

The result is a new graph structure Clone 1 -- Clone 2 that mirrors the original.
The correctness relies on the invariant enforced by the Hash Map: Every node in the original connected component is instantiated exactly once.
We call new Node(val) exactly once for every unique key in the map, ensuring no references to the original graph persist in the returned structure.

```cpp
/*
// Definition for a Node.
class Node {
public:
    int val;
    vector<Node*> neighbors;
    Node() { val = 0; neighbors = vector<Node*>(); }
    Node(int _val) { val = _val; neighbors = vector<Node*>(); }
    Node(int _val, vector<Node*> _neighbors) { val = _val; neighbors = _neighbors; }
};
*/
class Solution {
private:
    // Map to store mapping from original node to cloned node
    unordered_map<Node*, Node*> visited;

public:
    Node* cloneGraph(Node* node) {
        if (node == nullptr) {
            return nullptr;
        }

        // If the node is already visited, return the cloned instance
        if (visited.find(node) != visited.end()) {
            return visited[node];
        }

        // Create the clone for the current node
        Node* clone = new Node(node->val);

        // Add to map immediately to handle cycles
        visited[node] = clone;

        // Iterate through neighbors and clone them recursively
        for (Node* neighbor : node->neighbors) {
            clone->neighbors.push_back(cloneGraph(neighbor));
        }

        return clone;
    }
};
```
```java
/*
// Definition for a Node.
class Node {
    public int val;
    public List<Node> neighbors;
    public Node() { val = 0; neighbors = new ArrayList<Node>(); }
    public Node(int _val) { val = _val; neighbors = new ArrayList<Node>(); }
    public Node(int _val, ArrayList<Node> _neighbors) { val = _val; neighbors = _neighbors; }
}
*/
class Solution {
    // HashMap to keep track of visited nodes and their clones
    private HashMap<Node, Node> visited = new HashMap<>();

    public Node cloneGraph(Node node) {
        if (node == null) {
            return null;
        }

        // If the node was already cloned, return the reference
        if (visited.containsKey(node)) {
            return visited.get(node);
        }

        // Create a new node (clone)
        Node clone = new Node(node.val);

        // Store the clone in the map immediately
        visited.put(node, clone);

        // Recursively clone neighbors
        for (Node neighbor : node.neighbors) {
            clone.neighbors.add(cloneGraph(neighbor));
        }

        return clone;
    }
}
```
python""" # Definition for a Node. class Node: def __init__(self, val = 0, neighbors = None): self.val = val self.neighbors = neighbors if neighbors is not None else [] """ class Solution: def __init__(self): # Dictionary to save the visited nodes. # Key: Original Node, Value: Cloned Node self.visited = {} def cloneGraph(self, node: 'Node') -> 'Node': if not node: return None # If the node is already visited, return the clone from the dictionary if node in self.visited: return self.visited[node] # Create a new node with the value of the original node clone_node = Node(node.val, []) # Add to visited dictionary immediately to prevent cycles self.visited[node] = clone_node # Iterate through the neighbors to generate their clones if node.neighbors: for neighbor in node.neighbors: clone_node.neighbors.append(self.cloneGraph(neighbor)) return clone_node
Time Complexity: O(N + E), where N is the number of nodes and E is the number of edges. Each node is cloned exactly once, and each edge is examined a constant number of times while wiring up the neighbor lists.
Space Complexity: O(N) for the hash map (one entry per node), plus up to O(N) for the recursion stack in the worst case of a long path-like graph.
The Graph - Deep Copy / Cloning pattern and the use of a Hash Map during traversal are applicable to several other problems, most notably Copy List with Random Pointer (LeetCode 138), which relies on the same original-to-clone mapping.
Mistake: Implementing DFS without a visited map.
Concept Gap: Failure to recognize that undirected graphs contain inherent cycles (A <-> B).
Consequence: The recursion never terminates, calling clone(A) -> clone(B) -> clone(A) indefinitely until the stack limit is exceeded, exactly as in the naive sketch shown earlier.
Mistake: Creating a new visited map inside the recursive function scope instead of making it global or passing it as an argument.
Concept Gap: Scope management in recursion.
Consequence: Each recursive call thinks it is starting fresh, losing track of previously cloned nodes. It produces duplicate clones instead of one shared instance per node and, because the cycle check never finds entries written by parent calls, degenerates into the same unbounded recursion as having no map at all (see the sketch below).
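A minimal sketch of this scoping bug, assuming the same `Node` class as the solutions above (the function name `clone_graph_broken` is made up for illustration):

```python
def clone_graph_broken(node):
    visited = {}  # BUG: a brand-new map is built on every recursive call
    if node is None:
        return None
    if node in visited:        # never true: this map has no memory of parent calls
        return visited[node]
    clone = Node(node.val)
    visited[node] = clone      # recorded here, then discarded when the call returns
    for neighbor in node.neighbors:
        clone.neighbors.append(clone_graph_broken(neighbor))  # recurses forever on a cycle
    return clone
```

The fix is exactly what the accepted solutions do: keep the map outside the recursion, either as a member field (`self.visited`) or as an argument threaded through every call.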
Mistake: Not handling the node == null check at the very beginning.
Concept Gap: Edge case handling.
Consequence: Attempting to access node.val or node.neighbors on a null reference results in a Null Pointer Exception or AttributeError.
Mistake: Doing clone.neighbors = new ArrayList<>(original.neighbors).
Concept Gap: Shallow copy vs. Deep copy.
Consequence: The new node will point to the original neighbor nodes, not the cloned neighbor nodes. Modifying the clone would corrupt the original graph structure, as the sketch below demonstrates.
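A short Python sketch of this pitfall, assuming the `Node` class from the solutions above (Python's `list(node.neighbors)` copies references in the same way the Java line does):

```python
def shallow_clone(node):
    # WRONG: this copies the neighbor *references*, not the neighbor nodes.
    clone = Node(node.val)
    clone.neighbors = list(node.neighbors)  # same Node objects as the original graph
    return clone

a, b = Node(1), Node(2)
a.neighbors.append(b)
b.neighbors.append(a)

bad = shallow_clone(a)
print(bad.neighbors[0] is b)   # True  -> still wired into the original graph
bad.neighbors[0].val = 99      # "editing the copy"...
print(b.val)                   # 99    -> ...corrupts the original
```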