Maximum likelihood: more accurate demodulation

Another area of performance improvement available to engineers is advanced demodulation for handling spatial multiplexing. Because MIMO technology requires multiple transmit antennas, multiple data streams arrive at the receiving antennas overlaid on one another. The MIMO equalizer must therefore perform spatial demultiplexing to separate the data streams and restore them to their original, independent signals. Conventional 802.11n systems implement this with zero-forcing (ZF) MIMO equalization. Newer maximum-likelihood (ML) methods, however, offer greater accuracy.
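To make the ZF/ML contrast concrete, here is a minimal sketch in Python/NumPy (a toy model under simplifying assumptions, not any chipset's implementation) of both detectors for a 2x2 spatially multiplexed link carrying QPSK symbols. ZF inverts the channel matrix and slices each stream independently; ML exhaustively tests every candidate transmit vector and keeps the one closest to the received vector.

```python
import itertools
import numpy as np

# Unit-energy QPSK constellation shared by both detectors.
QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def zf_detect(H, y):
    """Zero-forcing: invert the channel, then slice each stream separately."""
    x_hat = np.linalg.pinv(H) @ y  # channel inversion (amplifies noise)
    return np.array([QPSK[np.argmin(np.abs(QPSK - s))] for s in x_hat])

def ml_detect(H, y):
    """Maximum likelihood: exhaustive search over all transmit vectors."""
    best, best_metric = None, np.inf
    for cand in itertools.product(QPSK, repeat=H.shape[1]):
        s = np.array(cand)
        metric = np.linalg.norm(y - H @ s) ** 2  # Euclidean distance to y
        if metric < best_metric:
            best, best_metric = s, metric
    return best

# Toy run: Rayleigh-like 2x2 channel, two overlaid streams, mild noise.
rng = np.random.default_rng(0)
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
x = rng.choice(QPSK, size=2)  # two independent data streams
y = H @ x + 0.1 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
print("sent:", x, "ZF:", zf_detect(H, y), "ML:", ml_detect(H, y))
```

On ill-conditioned channels the ZF inversion amplifies noise, which is where ML's joint search pays off; the price is a search that grows exponentially with the number of streams and the constellation size, so practical receivers typically rely on reduced-complexity ML variants.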

One of the main challenges facing processor verification teams is achieving fast, pre-RTL functional verification. Attempts are being made to raise the level of abstraction to the system level, with further interest in ESL (electronic system level) design.

Another factor that can considerably reduce verification time is attention to the compliance problem. The notion of a framework for a successful verification plan and strategy stems from these pre-RTL efforts, which can yield considerable time savings. There is ample room, however, to expand the framework's scope to cover both the pre-RTL and post-RTL verification processes. As the abstraction level rises, the need for compliance checking will grow: a majority of verification bugs arise from wrong specifications, communication problems, and ambiguity, they said.
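As a hedged illustration of that bug class, the toy Python sketch below checks a behavioral model against an executable specification over random stimulus; the names (golden_alu, dut_alu), the 16-bit wraparound rule, and the deliberately misread saturation behavior are all illustrative assumptions, not details from the paper.

```python
import random

def golden_alu(op, a, b):
    """Executable specification: the agreed reference behavior."""
    if op == "add":
        return (a + b) & 0xFFFF  # spec says 16-bit wraparound
    if op == "sub":
        return (a - b) & 0xFFFF
    raise ValueError(f"unspecified op: {op}")  # ambiguity surfaces loudly

def dut_alu(op, a, b):
    """Candidate model whose author misread the spec (a classic compliance bug)."""
    if op == "add":
        return min(a + b, 0xFFFF)  # bug: saturates where the spec wraps
    if op == "sub":
        return (a - b) & 0xFFFF
    raise ValueError(f"unspecified op: {op}")

random.seed(1)
for _ in range(1000):
    op = random.choice(["add", "sub"])
    a, b = random.randrange(0x10000), random.randrange(0x10000)
    ref, got = golden_alu(op, a, b), dut_alu(op, a, b)
    assert ref == got, (
        f"compliance miss: {op}({a:#06x}, {b:#06x}) spec={ref:#06x} dut={got:#06x}"
    )
```

Run as-is, the loop halts on the first overflowing addition, flagging the wrap-versus-saturate disagreement long before RTL exists; the same pattern scales up to instruction-set models checked against an ESL reference.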

While coverage data and analysis in the post-RTL scenario remain a continuing area of development, a slew of tools and techniques, both in-house and external, are in use at the large industry houses engaged in processor verification. To reduce time-to-market, therefore, the focus has to be on integrating these tools and on earlier reporting of failures. "Our work for a framework is directed towards such considerations, which can be helpful for reducing the overall verification time frame."

The paper was based on a project to build a sound verification plan covering and integrating the relevant tools, techniques, and methodologies. It puts in perspective a strategy that can be built upon further, and it rests on the belief that such a strategy and framework will benefit everyone engaged in verification. The next level of efficiency in verification will come partly from raising the level of abstraction and partly from a greater focus on compliance and still better coverage, with the latter offering ample scope for research communities to engage, they concluded.

Related Links:

1. Viewpoint: Formal verification with constraints – it doesn’t have to be like tightrope walking

2. Verification is alive and well at SoC virtual conference

3. Freescale, Synopsys broaden cooperation to cut IC verification cost

The following illustrates the steps a diagnosis tool goes through to analyze the actual response observed on the tester and determine which fault(s) might have caused the failure (Figure 1; a toy sketch of these steps follows the figure caption):

1. Trace back from each failing scan cell and identify which suspects (fault location and mechanism, such as "stuck-at 0") could have caused the failures.

2. Simulate the effect of each suspected fault to determine whether the simulated effect matches the actual response. Eliminate the suspects that failed in simulation but passed in silicon.

3. Rank suspects to identify which faults are most likely to have caused the failure.

Figure 1: Scan diagnosis algorithm example where the tester observed unexpected results in two scan cells.
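For readers who want the mechanics spelled out, the toy Python sketch below walks the same three steps on a two-scan-cell circuit; the miniature netlist, the fan-in cones, and the match-count ranking are illustrative assumptions, not the algorithm of any commercial diagnosis engine.

```python
from itertools import product

# Toy circuit: scan cell1 observes a AND b; scan cell2 observes b OR c.
NETS = ["a", "b", "c"]
CONES = {"cell1": {"a", "b"}, "cell2": {"b", "c"}}  # fan-in cone per scan cell

def simulate(pattern, fault=None):
    """Evaluate the circuit, optionally injecting a stuck-at fault (net, value)."""
    v = dict(pattern)
    if fault:
        net, val = fault
        v[net] = val  # force the net to its stuck value
    return {"cell1": v["a"] & v["b"], "cell2": v["b"] | v["c"]}

patterns = [dict(zip(NETS, bits)) for bits in product([0, 1], repeat=3)]
observed = [simulate(p, fault=("b", 0)) for p in patterns]  # "silicon" with a defect

# Step 1: trace back from each failing scan cell to candidate suspects
# (every net in its fan-in cone, with both stuck-at polarities).
failing = {c for p, o in zip(patterns, observed) for c in CONES
           if simulate(p)[c] != o[c]}
suspects = {(n, v) for c in failing for n in CONES[c] for v in (0, 1)}

# Steps 2 and 3: simulate each suspect; one that contradicts the silicon on
# some pattern scores lower, and sorting by score ranks the likely culprits.
scores = {f: sum(simulate(p, f) == o for p, o in zip(patterns, observed))
          for f in suspects}
for (net, val), score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"stuck-at-{val} on net {net}: explains {score}/{len(patterns)} patterns")
```

On this example the injected stuck-at-0 fault on net b explains all eight patterns and ranks first, mirroring how a real tool would surface its top suspect.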
