Code:
// Three points: two near the origin and one far away at (5000, 5000)
_line = new PointLineSeries();
_line.LineStyle.Color = Color.Red;
_line.Points = new SeriesPoint[3];
_line.Points[0].X = 0;
_line.Points[0].Y = 0;
_line.Points[1].X = 50;
_line.Points[1].Y = 50;
_line.Points[2].X = 5000;
_line.Points[2].Y = 5000;
_view.PointLineSeries.Add(_line);

// Zoom in very tightly (±0.0001) around the middle point at (50, 50)
double dx = 0.0001, dy = 0.0001;
_view.XAxes[0].SetRange(50 - dx, 50 + dx);
_view.YAxes[0].SetRange(50 - dy, 50 + dy);
There seems to be a rendering issue when the axis view range becomes very small compared to the extent of the PointLineSeries data: with the range set to just ±0.0001 around (50, 50) while one data point sits at (5000, 5000), the line is no longer drawn correctly. Is this a rounding/numeric-precision issue, or am I doing something wrong?
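For what it's worth, the symptoms look consistent with single-precision rounding: many chart renderers convert coordinates to 32-bit floats for the GPU, and a ±0.0001 window around 50 sits right at the edge of what float32 can resolve. This is only a guess about what happens internally, not LightningChart's documented pipeline; the Python sketch below simulates float32 round-trips to show the magnitudes involved:

```python
import struct

def f32(x):
    """Round-trip a double through a 32-bit float (simulates GPU precision)."""
    return struct.unpack('f', struct.pack('f', x))[0]

x_min, x_max = 50 - 0.0001, 50 + 0.0001   # the zoomed-in axis range
ulp = 2.0 ** -18                          # float32 spacing between values near 50.0

# Rounding error from storing the range edge as float32 ...
err = abs(f32(x_min) - x_min)
print(f"error storing {x_min} as float32: {err:.3e}")

# ... while the whole visible range covers only a handful of float32 steps:
print(f"distinct float32 values across the X range: {(x_max - x_min) / ulp:.0f}")

# The far point (5000, 5000) maps to a huge normalized coordinate, in a
# magnitude region where float32 can no longer represent every integer:
norm = (5000 - x_min) / (x_max - x_min)
print(f"normalized X of the far point: {norm:.1f}")
print(f"after a float32 round-trip:    {f32(norm)!r}")
```

If that matches what the library does internally, the visible X range spans only a few dozen representable float32 values, and the off-screen endpoint lands tens of millions of "range widths" away, so line clipping and interpolation would both suffer.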
You will find the complete project attached.