[Python] Gaussian Filter-based Time-series Denoising

by 오디세이99 2023. 7. 18.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.preprocessing import MinMaxScaler
import matplotlib.pyplot as plt

# Generate clean sine wave
data_size = 2000  # Number of samples in the series
t = np.linspace(0, 10, data_size)
clean_data = np.sin(t)

# Add uniform noise
noise = np.random.uniform(-0.1, 0.1, data_size)
noisy_data = clean_data + noise

# Normalize data to [0, 1] (MinMaxScaler expects a 2D array, hence the reshape;
# the Gaussian filter itself does not require normalization, but the step is kept
# so the example matches pipelines that operate on scaled data)
scaler = MinMaxScaler(feature_range=(0, 1))
noisy_scaled = scaler.fit_transform(noisy_data.reshape(-1, 1))  # shape (N, 1)

# Apply Gaussian filter for noise removal
# (the second axis has length 1, so smoothing effectively happens only along the time axis)
sigma = 5  # Standard deviation of the Gaussian kernel, in samples; larger sigma = stronger smoothing
filtered_scaled = gaussian_filter(noisy_scaled, sigma)

# Rescale filtered data back to the original range and flatten to 1D for plotting
filtered_data = scaler.inverse_transform(filtered_scaled).flatten()

# Plotting
plt.figure(figsize=(10, 6))
plt.plot(t, clean_data, label='Clean Data')
plt.plot(t, noisy_data, label='Noisy Data')
plt.plot(t, filtered_data, label='Filtered Data')
plt.title('Gaussian Filter for Noise Removal')
plt.xlabel('Time')
plt.ylabel('Amplitude')
plt.legend()
plt.show()
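
The key parameter here is sigma: a small value leaves residual noise, while a very large value starts to flatten the peaks of the sine wave itself. As a rough sketch (not part of the original script), the snippet below applies SciPy's gaussian_filter1d directly to the raw 1D noisy signal for a few sigma values and reports the RMSE against the clean signal, so the trade-off can be seen numerically. Because the Gaussian kernel is normalized and MinMax scaling is an affine transform, filtering the unscaled signal like this is equivalent to the normalize → filter → inverse-transform pipeline above.

import numpy as np
from scipy.ndimage import gaussian_filter1d

# Rebuild the same test signal as in the script above
data_size = 2000
t = np.linspace(0, 10, data_size)
clean_data = np.sin(t)
noisy_data = clean_data + np.random.uniform(-0.1, 0.1, data_size)

# Compare several smoothing strengths against the clean reference
for sigma in (1, 5, 20, 100):
    smoothed = gaussian_filter1d(noisy_data, sigma)        # filter the raw 1D signal directly
    rmse = np.sqrt(np.mean((smoothed - clean_data) ** 2))  # lower means closer to the clean sine
    print(f'sigma={sigma:>3}: RMSE vs clean signal = {rmse:.4f}')

Note that sigma is measured in samples: with 2000 samples spanning 10 time units, sigma=5 corresponds to roughly 0.025 time units of smoothing.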
