You are given three independent coding prompts. For each prompt, clearly define your function/class interfaces, handle edge cases, and analyze time/space complexity.
Given two non-decreasing integer arrays:
- `nums1` of length `m + n`, where the first `m` elements are valid and the last `n` elements are placeholders.
- `nums2` of length `n`.

Merge `nums2` into `nums1` so that `nums1` becomes a single sorted array.
Inputs
- `nums1: int[]`, `m: int`
- `nums2: int[]`, `n: int`

Output
`nums1`, modified in-place.
Constraints (typical interview scale)
0 ≤ m, n ≤ 2e5
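
One way to satisfy the in-place requirement is a two-pointer merge from the back of `nums1`, so no element is overwritten before it is read. A minimal Python sketch (the signature mirrors the prompt's inputs; the name `merge` is just illustrative):

```python
from typing import List

def merge(nums1: List[int], m: int, nums2: List[int], n: int) -> None:
    """Merge nums2 into nums1 in place, filling positions from the back."""
    i, j, k = m - 1, n - 1, m + n - 1
    while j >= 0:
        # Take the larger of the two tails; once nums1 is exhausted (i < 0),
        # the remaining nums2 elements are copied straight in.
        if i >= 0 and nums1[i] > nums2[j]:
            nums1[k] = nums1[i]
            i -= 1
        else:
            nums1[k] = nums2[j]
            j -= 1
        k -= 1
```

This runs in O(m + n) time and O(1) auxiliary space; the edge cases `m == 0` and `n == 0` fall out of the loop condition without special handling.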
Follow-up: how would you merge three sorted arrays (`A`, `B`, `C`) into one sorted array?
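
For the follow-up, a min-heap over the current head of each array generalizes to any number of inputs. A sketch under that approach (the name `merge_k` is illustrative; Python's standard `heapq.merge` does the same thing lazily):

```python
import heapq
from typing import List

def merge_k(arrays: List[List[int]]) -> List[int]:
    """Merge k sorted arrays using a min-heap of (value, array idx, elem idx)."""
    heap = [(arr[0], i, 0) for i, arr in enumerate(arrays) if arr]
    heapq.heapify(heap)
    out: List[int] = []
    while heap:
        value, i, j = heapq.heappop(heap)
        out.append(value)
        # Push the next element from the array we just consumed from.
        if j + 1 < len(arrays[i]):
            heapq.heappush(heap, (arrays[i][j + 1], i, j + 1))
    return out
```

With N total elements across k arrays this is O(N log k) time and O(k) heap space.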
Implement a round-robin CPU/task scheduler with a fixed time quantum `q`.
Each task has:
- `id` (string/int)
- `arrivalTime` (non-negative integer)
- `burstTime` (positive integer total runtime required)
Scheduling rules:
- A task becomes ready at its `arrivalTime`.
- The task at the front of the ready queue runs for `min(q, remainingTime)`; if it still has time remaining, it rejoins the back of the queue.
Output
Return the execution trace as a list of segments like `(taskId, startTime, endTime)`.
Include idle-time segments if the CPU is idle.
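
A minimal sketch, assuming two conventions the prompt leaves open: tasks that arrive during (or exactly at the end of) a slice enter the ready queue before the preempted task rejoins it, and idle gaps are reported with a sentinel id `"IDLE"`.

```python
from collections import deque
from typing import Deque, List, Tuple

def round_robin(tasks: List[Tuple[str, int, int]], q: int) -> List[Tuple[str, int, int]]:
    """tasks: (id, arrivalTime, burstTime) triples. Returns (taskId, start, end)
    segments, with ("IDLE", start, end) whenever the CPU waits for an arrival."""
    pending: Deque[Tuple[str, int, int]] = deque(sorted(tasks, key=lambda t: t[1]))
    ready: Deque[Tuple[str, int]] = deque()  # (id, remainingTime)
    trace: List[Tuple[str, int, int]] = []
    time = 0

    def admit(now: int) -> None:
        # Move every task that has arrived by `now` into the ready queue.
        while pending and pending[0][1] <= now:
            tid, _, burst = pending.popleft()
            ready.append((tid, burst))

    while pending or ready:
        admit(time)
        if not ready:
            # CPU is idle until the next arrival; record the gap.
            next_arrival = pending[0][1]
            trace.append(("IDLE", time, next_arrival))
            time = next_arrival
            continue
        tid, remaining = ready.popleft()
        run = min(q, remaining)
        trace.append((tid, time, time + run))
        time += run
        # Tasks arriving during this slice queue ahead of the preempted task.
        admit(time)
        if remaining > run:
            ready.append((tid, remaining - run))
    return trace
```

Under this convention, `round_robin([("A", 0, 5), ("B", 2, 3)], q=2)` yields `[("A", 0, 2), ("B", 2, 4), ("A", 4, 6), ("B", 6, 7), ("A", 7, 8)]`. Sorting costs O(n log n); each slice is emitted exactly once, so the rest is linear in the number of trace segments (at most `sum(ceil(burstTime / q))`).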
Design and implement a simplified in-memory trading simulator that processes a time-ordered stream of events.
Each event is one of:
- `DEPOSIT amount`
- `BUY symbol qty price`
- `SELL symbol qty price`
Rules:
- `BUY` decreases cash by `qty * price` and increases the position for `symbol` by `qty`.
- `SELL` increases cash by `qty * price` and decreases the position by `qty`.
- Reject any `BUY` that would make cash negative.
- Reject any `SELL` that would make the position for that symbol negative.
Output
After processing all events, return the final cash balance and the final position for each symbol.
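
A minimal sketch, assuming rejected `BUY`/`SELL` events are silently skipped and amounts may be fractional (floats); neither detail is pinned down by the prompt, and the name `simulate` is illustrative.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def simulate(events: List[str]) -> Tuple[float, Dict[str, int]]:
    """Process a time-ordered event stream; returns (final cash, positions)."""
    cash = 0.0
    positions: Dict[str, int] = defaultdict(int)
    for event in events:
        parts = event.split()
        if parts[0] == "DEPOSIT":
            cash += float(parts[1])
        elif parts[0] == "BUY":
            symbol, qty, price = parts[1], int(parts[2]), float(parts[3])
            cost = qty * price
            if cost <= cash:  # reject a BUY that would make cash negative
                cash -= cost
                positions[symbol] += qty
        elif parts[0] == "SELL":
            symbol, qty, price = parts[1], int(parts[2]), float(parts[3])
            if positions[symbol] >= qty:  # reject an oversell
                cash += qty * price
                positions[symbol] -= qty
    return cash, dict(positions)
```

Each event is handled in O(1) expected time via the hash map, so a stream of E events costs O(E) time and O(S) space for S distinct symbols.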
Follow-ups (optional, if time allows)