MATLAB Answers


Why does a filter change the amplitude of a signal?

Asked by pavan sunder on 2 Feb 2016
Latest activity: Answered by Star Strider on 2 Feb 2016
When I filtered a signal with a cutoff matching the desired range, I got the signal at the desired frequency, but the amplitude was altered. Can anybody explain the reason?
I assume that since the input signal (represented as a combination of complex exponentials) is passed through the filter, the output is scaled, because complex exponentials are eigenfunctions of LTI systems. Is this justification correct?
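The eigenfunction reasoning can be checked directly: an LTI filter maps a sinusoid at frequency f0 to the same sinusoid scaled by |H(f0)|, so the output amplitude is predicted by the filter's frequency response. A minimal sketch (in Python/SciPy here for illustration; the sample rate, cutoff, and tone frequency are assumed values — MATLAB's `butter`, `filter`, and `freqz` behave the same way):

```python
import numpy as np
from scipy import signal

fs = 1000.0                          # sample rate in Hz (assumed for illustration)
b, a = signal.butter(4, 100, fs=fs)  # 4th-order Butterworth low-pass, 100 Hz cutoff

f0 = 80.0                            # tone inside the passband but near the cutoff
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * f0 * t)       # unit-amplitude input sinusoid
y = signal.lfilter(b, a, x)

# The frequency response at f0 predicts the steady-state output amplitude
_, h = signal.freqz(b, a, worN=[f0], fs=fs)
predicted_gain = np.abs(h[0])
measured_gain = np.max(np.abs(y[len(y) // 2:]))  # measure after the transient dies out

print(predicted_gain, measured_gain)
```

The two numbers agree: even though 80 Hz is inside the 100 Hz passband, the gradual Butterworth rolloff has already pulled |H(80)| below 1, which is exactly the amplitude change observed.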

  1 Comment

Why do you think it shouldn't? For example, if you blur a signal or an image, you make the bright parts dimmer, because they were blurred. It makes sense.
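The blur analogy can be made concrete: a moving-average (blur) kernel spreads a spike's energy over its neighbours, so the peak value drops even though nothing is "lost". A tiny sketch with illustrative values:

```python
import numpy as np

x = np.zeros(11)
x[5] = 1.0                        # a single bright spike of amplitude 1
k = np.ones(3) / 3                # 3-point moving-average (blur) kernel
y = np.convolve(x, k, mode='same')

print(y.max())                    # the spike's peak drops to 1/3
```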


1 Answer

Answer by Star Strider on 2 Feb 2016

It depends on the filter design you choose. A filter with a gradual rolloff (such as a Butterworth design) can attenuate frequencies near the passband edge. Increasing the filter order reduces or eliminates that attenuation, although it also lengthens the filter. A Chebyshev design, with its sharper cutoff, could also minimise the passband attenuation.
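The effect of filter order on passband attenuation can be seen numerically: evaluating the Butterworth response at a frequency just inside the cutoff shows the gain climbing toward 1 as the order increases. A sketch (Python/SciPy here for illustration; the sample rate, cutoff, and test frequency are assumed values):

```python
import numpy as np
from scipy import signal

fs = 1000.0   # sample rate in Hz (assumed)
fc = 100.0    # low-pass cutoff frequency
f0 = 90.0     # a passband frequency close to the cutoff

gains = []
for order in (2, 4, 8):
    b, a = signal.butter(order, fc, fs=fs)       # Butterworth low-pass of given order
    _, h = signal.freqz(b, a, worN=[f0], fs=fs)  # frequency response evaluated at f0
    gains.append(np.abs(h[0]))
    print(f"order {order}: gain at {f0} Hz = {gains[-1]:.3f}")
```

Each gain is below 1 (the rolloff reaches into the passband), but higher orders sharpen the transition and push the gain at 90 Hz closer to unity, which is the trade-off described above.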

